ERIC Educational Resources Information Center
Riccomini, Paul J.; Morano, Stephanie; Hughes, Charles A.
2017-01-01
It is understandable that misuse of the terms "specially designed instruction" (SDI), "high-leverage practices" (HLPs), "explicit instruction" (EI), and "intensive instruction"(II) has bred confusion among professionals, and this confusion may lead to miscommunication and misunderstandings in the field.…
1995-09-01
vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems
Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report
1995-06-01
technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge ...resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that
Binding Leverage as a Molecular Basis for Allosteric Regulation
Mitternacht, Simon; Berezovsky, Igor N.
2011-01-01
Allosteric regulation involves conformational transitions or fluctuations between a few closely related states, caused by the binding of effector molecules. We introduce a quantity called binding leverage that measures the ability of a binding site to couple to the intrinsic motions of a protein. We use Monte Carlo simulations to generate potential binding sites and either normal modes or pairs of crystal structures to describe relevant motions. We analyze single catalytic domains and multimeric allosteric enzymes with complex regulation. For the majority of the analyzed proteins, we find that both catalytic and allosteric sites have high binding leverage. Furthermore, our analysis of the catabolite activator protein, which is allosteric without conformational change, shows that its regulation involves other types of motion than those modulated at sites with high binding leverage. Our results point to the importance of incorporating dynamic information when predicting functional sites. Because it is possible to calculate binding leverage from a single crystal structure it can be used for characterizing proteins of unknown function and predicting latent allosteric sites in any protein, with implications for drug design. PMID:21935347
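The binding-leverage idea in the abstract above, coupling between a candidate site and a protein's intrinsic motions, can be illustrated with a toy calculation: score a site by how strongly one normal mode changes the distances between pairs of atoms lining it. The sketch below (Python/NumPy) only illustrates that coupling measure and is not the authors' implementation; the coordinates, mode, and site indices are invented.

```python
import numpy as np

def binding_leverage_score(coords, mode, site_atom_indices):
    """Toy score for how strongly one normal mode deforms a candidate site.

    coords            : (N, 3) array of atomic coordinates
    mode              : (N, 3) array, displacement vector of one normal mode
    site_atom_indices : indices of atoms lining the candidate binding site

    Sums the squared first-order change in inter-atomic distance, under a unit
    step along the mode, over all atom pairs in the site -- a simplified
    analogue of the coupling the abstract calls "binding leverage".
    """
    score = 0.0
    idx = list(site_atom_indices)
    for a in range(len(idx)):
        for b in range(a + 1, len(idx)):
            i, j = idx[a], idx[b]
            rij = coords[j] - coords[i]
            d = np.linalg.norm(rij)
            # Change of the i-j distance to first order along the mode.
            delta = np.dot(rij / d, mode[j] - mode[i])
            score += delta ** 2
    return score

# Hypothetical usage: 100 atoms, one mode, a site lined by five atoms.
rng = np.random.default_rng(0)
coords = rng.normal(size=(100, 3)) * 10.0
mode = rng.normal(size=(100, 3)) * 0.1
print(binding_leverage_score(coords, mode, site_atom_indices=[3, 17, 42, 55, 78]))
```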
NASA Technical Reports Server (NTRS)
Bosanac, Natasha; Cox, Andrew; Howell, Kathleen C.; Folta, David C.
2017-01-01
Lunar IceCube is a 6U CubeSat that is designed to detect and observe lunar volatiles from a highly inclined orbit. This spacecraft, equipped with a low-thrust engine, will be deployed from the upcoming Exploration Mission-1 vehicle in late 2018. However, significant uncertainty in the deployment conditions for secondary payloads impacts both the availability and geometry of transfers that deliver the spacecraft to the lunar vicinity. A framework that leverages dynamical systems techniques is applied to a recently updated set of deployment conditions and spacecraft parameter values for the Lunar IceCube mission, demonstrating the capability for rapid trajectory design.
NASA Astrophysics Data System (ADS)
Bosanac, Natasha; Cox, Andrew D.; Howell, Kathleen C.; Folta, David C.
2018-03-01
Lunar IceCube is a 6U CubeSat that is designed to detect and observe lunar volatiles from a highly inclined orbit. This spacecraft, equipped with a low-thrust engine, is expected to be deployed from the upcoming Exploration Mission-1 vehicle. However, significant uncertainty in the deployment conditions for secondary payloads impacts both the availability and geometry of transfers that deliver the spacecraft to the lunar vicinity. A framework that leverages dynamical systems techniques is applied to a recently updated set of deployment conditions and spacecraft parameter values for the Lunar IceCube mission, demonstrating the capability for rapid trajectory design.
NREL's Research Support Facility Certified LEED® Platinum | News | NREL
to sustainable building design and construction. At 222,000 square-feet, the RSF is a model for sustainable, high performance building design that leverages the best in energy efficiency and environmental energy use in commercial buildings that were incorporated in the design of the RSF. NREL researchers are
ERIC Educational Resources Information Center
Capobianco, Brenda M.; DeLisi, Jacqueline; Radloff, Jeffrey
2018-01-01
In an effort to document teachers' enactments of new reform in science teaching, valid and scalable measures of science teaching using engineering design are needed. This study describes the development and testing of an approach for documenting and characterizing elementary science teachers' multiday enactments of engineering design-based science…
The Modern Research Data Portal: A Design Pattern for Networked, Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chard, Kyle; Dart, Eli; Foster, Ian
Here we describe best practices for providing convenient, high-speed, secure access to large data via research data portals. We capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance Science DMZs and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.
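The Python APIs referenced here are those of the Globus service; the companion site (https://docs.globus.org/mrdp) holds the authoritative application skeletons. As a rough illustration only, and assuming the globus_sdk package (v3-style calls), a portal back end might authenticate and submit a transfer roughly as follows; the client ID, endpoint UUIDs, and paths are placeholders.

```python
import globus_sdk

# Placeholder values -- a real portal obtains these from its own Globus app
# registration and from the user's session.
CLIENT_ID = "YOUR-NATIVE-APP-CLIENT-ID"
SRC_ENDPOINT = "source-endpoint-uuid"       # e.g., the portal's Science DMZ DTN
DST_ENDPOINT = "destination-endpoint-uuid"  # e.g., the user's own endpoint

TRANSFER_SCOPE = "urn:globus:auth:scope:transfer.api.globus.org:all"

# Authentication/authorization (native-app flow for brevity; a web portal
# would typically use a confidential client and a three-legged OAuth flow).
auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth_client.oauth2_start_flow(requested_scopes=TRANSFER_SCOPE)
print("Please visit:", auth_client.oauth2_get_authorize_url())
tokens = auth_client.oauth2_exchange_code_for_tokens(input("Auth code: ").strip())
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]

# Data transfer: stage a published dataset from the portal to the user.
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token)
)
tdata = globus_sdk.TransferData(tc, SRC_ENDPOINT, DST_ENDPOINT,
                                label="portal download")
tdata.add_item("/published/dataset-001/", "/~/dataset-001/", recursive=True)
task = tc.submit_transfer(tdata)
print("Submitted transfer task:", task["task_id"])
```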
The Modern Research Data Portal: a design pattern for networked, data-intensive science
Chard, Kyle; Dart, Eli; Foster, Ian; ...
2018-01-15
We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.
Crowdteaching: Supporting Teaching as Designing in Collective Intelligence Communities
ERIC Educational Resources Information Center
Recker, Mimi; Yuan, Min; Ye, Lei
2014-01-01
The widespread availability of high-quality Web-based content offers new potential for supporting teachers as designers of curricula and classroom activities. When coupled with a participatory Web culture and infrastructure, teachers can share their creations as well as leverage from the best that their peers have to offer to support a collective…
Technological Leverage in Higher Education: An Evolving Pedagogy
ERIC Educational Resources Information Center
Pillai, K. Rajasekharan; Prakash, Ashish Viswanath
2017-01-01
Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…
Operability driven space system concept with high leverage technologies
NASA Astrophysics Data System (ADS)
Woo, Henry H.
1997-01-01
One of the common objectives of future launch and space transfer systems is to achieve low-cost and effective operational capability by automating processes from pre-launch to the end of mission. Hierarchical and integrated mission management, system management, autonomous GN&C, and integrated micro-nano avionics technologies are critical to extending or revitalizing the exploitation of space. These high-leverage hardware and software technologies are essential to space transfer, orbital systems, Earth-To-Orbit (ETO), commercial and military aviation, and planetary systems. This paper covers the driving issues, goals, and requirements definition, supported with typical concepts and utilization of multi-use technologies. The approach and method result in a practical system architecture and lower-level design concepts.
State-Level High School Improvement Systems Checklist
ERIC Educational Resources Information Center
National High School Center, 2007
2007-01-01
This checklist is designed to help states at various stages develop their system of support to reach struggling high schools. The checklist can be used to assess where your state is in terms of the elements of using existing support and guidance mechanisms, and reconfiguring and/or creating new structures to leverage system change for high school…
Leveraging Sociocultural Theory to Create a Mentorship Program for Doctoral Students
ERIC Educational Resources Information Center
Crosslin, Matt; Wakefield, Jenny S.; Bennette, Phyllis; Black, James William, III
2013-01-01
This paper details a proposed doctoral student connections program that is based on sociocultural theory. It is designed to assist new students with starting their educational journey. This program is designed to leverage social interactions, peer mentorship, personal reflection, purposeful planning, and existing resources to assist students in…
Leveraging Graduate Education for a More Relevant Future
ERIC Educational Resources Information Center
Davis, Meredith
2012-01-01
Arguing that the 21st century context for design is significantly different from the previous century, a set of structural suggestions are posed that can leverage change. Administrative arrangements are questioned along with the lack of clear differentiation or performance expectation among design degrees. While widespread, confusing and…
NASA Technical Reports Server (NTRS)
Steele, John W.; Rector, Tony; Bue, Grant C.; Campbell, Colin; Makinen, Janice
2012-01-01
A dual-bed device to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop has been designed and is undergoing testing. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing sublimator technology. The driver for the development of a water recirculation maintenance device is to further enhance this advantage through the leveraging of fluid loop management lessons learned from the International Space Station (ISS). A bed design that was developed for a Hamilton Sundstrand military application, and considered for a potential ISS application with the Urine Processor Assembly, provides a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high-capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit Transport Water Loop. The bed design further leverages a sorbent developed for the ISS that introduces a biocide in a microgravity-compatible manner for the Internal Active Thermal Control System. The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of crewed spaceflight Environmental Control and Life Support System hardware.
NASA Technical Reports Server (NTRS)
Steele, John W.; Rector, Tony; Bue, Grant C.; Campbell, Colin; Makinen, Janice
2011-01-01
A dual-bed device to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop has been designed and is undergoing testing. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing Sublimator technology. The driver for the development of a water recirculation maintenance device is to further enhance this advantage through the leveraging of fluid loop management lessons-learned from the International Space Station (ISS). A bed design that was developed for a Hamilton Sundstrand military application, and considered for a potential ISS application with the Urine Processor Assembly, provides a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The bed design further leverages a sorbent developed for ISS that introduces a biocide in a microgravity-compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a clear demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware.
ERIC Educational Resources Information Center
Kunzler, Jayson S.
2012-01-01
This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.
Design and Characterization of a Novel Bio-inspired Hair Flow Sensor Based on Resonant Sensing
NASA Astrophysics Data System (ADS)
Guo, X.; Yang, B.; Wang, Q. H.; Lu, C. F.; Hu, D.
2018-03-01
Flow sensors inspired by the natural hair sensing mechanism have great prospects in research on micro-autonomous systems and technology (MAST) because their three-dimensional structure offers high spatial and quality utilization. A novel bio-inspired hair flow sensor (BHFS) based on resonant sensing with a unique asymmetric design is presented in this paper. A hair transducer and a signal detector, which consists of a two-stage micro-leverage mechanism and two symmetrical resonators (double-ended tuning forks, DETF), are adopted to achieve high sensitivity to air flow. The sensitivity of the proposed BHFS is improved significantly over previously published designs owing to the high sensitivity of the resonators and the higher amplification factor of the two-stage micro-leverage mechanism. The standard deep dry silicon on glass (DDSOG) process is chosen to fabricate the proposed BHFS. The experimental results demonstrate that the fabricated BHFS has a mechanical sensitivity of 5.26 Hz/(m/s)² at a resonant frequency of 22 kHz with a hair height of 6 mm.
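Given the reported units of Hz/(m/s)², the frequency shift is presumably roughly quadratic in flow speed; under that assumption, a quick sanity check of the reported sensitivity:

\[
\Delta f \approx S\,v^{2}, \qquad S = 5.26~\mathrm{Hz/(m/s)^{2}}
\]
\[
v = 2~\mathrm{m/s} \;\Rightarrow\; \Delta f \approx 5.26 \times 4 \approx 21~\mathrm{Hz}
\quad \text{on a carrier of } f_{0} = 22~\mathrm{kHz}\ (\approx 0.1\%).
\]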
ERIC Educational Resources Information Center
Suh, Jennifer
2010-01-01
The following study describes design research in an elementary school near the metropolitan D.C. area with a diverse student population. The goal of the project was to design tasks that leveraged technology and enhanced access to critical thinking in specific mathematical concepts: data analysis and probability. It highlights the opportunities…
NASA Astrophysics Data System (ADS)
dos Santos Fradinho, Jorge Miguel
2014-05-01
Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises are comprised of multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.
Fostering learners' interaction with content: A learner-centered mobile device interface
NASA Astrophysics Data System (ADS)
Abdous, M.
2015-12-01
With the ever-increasing omnipresence of mobile devices in student life, leveraging smart devices to foster students' interaction with course content is critical. Following a learner-centered design iterative approach, we designed a mobile interface that may enable learners to access and interact with online course content efficiently and intuitively. Our design process leveraged recent technologies, such as bootstrap, Google's Material Design, HTML5, and JavaScript to design an intuitive, efficient, and portable mobile interface with a variety of built-in features, including context sensitive bookmarking, searching, progress tracking, captioning, and transcript display. The mobile interface also offers students the ability to ask context-related questions and to complete self-checks as they watch audio/video presentations. Our design process involved ongoing iterative feedback from learners, allowing us to refine and tweak the interface to provide learners with a unified experience across platforms and devices. The innovative combination of technologies built around well-structured and well-designed content seems to provide an effective learning experience to mobile learners. Early feedback indicates a high level of satisfaction with the interface's efficiency, intuitiveness, and robustness from both students and faculty.
NASA Technical Reports Server (NTRS)
Steele, John W.; Rector, Tony; Bue, Grant C.; Campbell, Colin; Makinen, Janice
2013-01-01
A dual-bed device to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop has been designed and is undergoing testing. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing Sublimator technology. The driver for the development of a water recirculation maintenance device is to further enhance this advantage through the leveraging of fluid loop management lessons-learned from the International Space Station (ISS). A bed design that was developed for a Hamilton Sundstrand military application, and considered for a potential ISS application with the Urine Processor Assembly, provides a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The bed design further leverages a sorbent developed for ISS that introduces a biocide in a microgravity-compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware.
USDA-ARS?s Scientific Manuscript database
Many studies leverage targeted whole genome sequencing (WGS) experiments in order to identify rare and causal variants within populations. As a natural consequence of experimental design, many of these surveys tend to sequence redundant haplotype segments due to high frequency in the base population...
Vortex wake control via smart structures technology
NASA Astrophysics Data System (ADS)
Quackenbush, Todd R.; Bilanin, Alan J.; McKillip, Robert M., Jr.
1996-05-01
Control of trailing vortex wakes is an important challenge for both military and civilian applications. This paper summarizes an assessment of the feasibility of mitigating adverse vortex wake effects using control surfaces actuated via Shape Memory Alloy (SMA) technology. The assessment involved a combined computational/design analysis that identified methods for introducing small secondary vortices to promote the deintensification of the vortex wakes of submarines and aircraft. Computational analyses of wake breakup using this 'vortex leveraging' strategy were undertaken and showed dramatic increases in the dissipation rate of concentrated vortex wakes. This paper briefly summarizes these results and describes the preliminary design of actuation mechanisms for the deflectable surfaces that effect the required time-varying wake perturbations. These surfaces, which build on the high-force, high-deflection capabilities of SMA materials, are shown to be well suited for the very low frequency actuation requirements of the wake deintensification mission. The paper outlines the assessment of device performance capabilities and describes the sizing studies undertaken for full-scale Vortex Leveraging Tabs (VLTs) designed for use in hydrodynamic and aerodynamic applications. Results obtained to date indicate that the proposed VLTs can accelerate wake breakup by over a factor of three and can be implemented using deflectable surfaces actuated by SMAs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.
1990-01-01
The results are described of an application of multiattribute analysis to the evaluation of high-leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station (SS) Freedom baseline design. An implication is that high-leverage prototyping is beneficial to the SS Freedom Program as a means for transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high-value, low-risk technology development versus high-value, high-risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. Eight attributes affected the rankings: initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.
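A minimal sketch of how multiattribute scoring with uncertainty might be set up is shown below: each attribute score is given a range rather than a point value, Monte Carlo draws propagate that uncertainty, and a weighted additive utility ranks the tasks. The weights, score ranges, and task names are hypothetical and do not come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical attribute weights (sum to 1), in the spirit of the study's
# attributes: initial cost, operations cost, crew productivity, safety, ...
weights = np.array([0.20, 0.15, 0.25, 0.20, 0.10, 0.05, 0.05])

# Each task's attribute scores are uncertain: (low, high) ranges on a 0-10
# scale, sampled uniformly.
tasks = {
    "telerobotic servicer": [(3, 6), (4, 7), (6, 9), (5, 8), (4, 6), (5, 9), (3, 7)],
    "automated inspection": [(5, 8), (5, 8), (4, 7), (6, 9), (5, 7), (4, 7), (4, 6)],
}

n_draws = 10_000
for name, ranges in tasks.items():
    lo = np.array([r[0] for r in ranges], dtype=float)
    hi = np.array([r[1] for r in ranges], dtype=float)
    samples = rng.uniform(lo, hi, size=(n_draws, len(weights)))
    utilities = samples @ weights          # weighted additive utility per draw
    print(f"{name:22s} mean={utilities.mean():5.2f}  5-95%="
          f"[{np.percentile(utilities, 5):.2f}, {np.percentile(utilities, 95):.2f}]")
```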
Leveraging the Educational Outreach Efforts of Low-Cost Missions
NASA Technical Reports Server (NTRS)
Fisher, Diane K.; Leon, Nancy J.
2000-01-01
A small portion of the budget for every NASA mission must be devoted to education and public outreach. The question is, how can projects best leverage these funds to create a high-quality message and get it disseminated to the largest and most appropriate audience? This paper describes the approach taken by a small educational outreach team for NASA's New Millennium Program (NMP). The team's approach has been twofold: develop a highly desirable suite of products designed to appeal to, as well as enlighten, the target audience; then negotiate relationships with existing, often under-utilized channels for dissemination of these products. Starting with NMP missions as the base of support for these efforts, the team has invited participation by other missions. This approach has resulted in a richer and broader message, and has allowed the continuing development of the audience base.
Design Considerations for Integrating Twitter into an Online Course
ERIC Educational Resources Information Center
Rohr, Linda E.; Costello, Jane; Hawkins, Thomas
2015-01-01
While the use of Twitter for communication and assessment activities in online courses is not new, it has not been without its challenges. This is increasingly true of high enrolment courses. The use of a Twitter Evaluation application which leverages a Learning Management System's (LMS's) application programming interface (API) provides a…
"Truth," Interrupted: Leveraging Digital Media for Culturally Sustaining Education
ERIC Educational Resources Information Center
Buckley-Marudas, Mary Frances
2017-01-01
This inquiry into the digital discussion forums tied to two English classes in an urban public high school examines the potential of new media to honor the multicultural composition of classrooms and support teachers to design culturally sustaining pedagogies. Given the increasing significance of digital media as well as the growing diversity of…
Equitable Leadership on the Ground: Converging on High-Leverage Practices
ERIC Educational Resources Information Center
Galloway, Mollie K.; Ishimaru, Ann M.
2017-01-01
What would leadership standards look like if developed through a lens and language of equity? We engaged with a group of 40 researchers, practitioners, and community leaders recognized as having expertise on equity in education to address this question. Using a Delphi technique, an approach designed to elicit expert feedback and measure…
Multi-Step Attack Detection via Bayesian Modeling under Model Parameter Uncertainty
ERIC Educational Resources Information Center
Cole, Robert
2013-01-01
Organizations in all sectors of business have become highly dependent upon information systems for the conduct of business operations. Of necessity, these information systems are designed with many points of ingress, points of exposure that can be leveraged by a motivated attacker seeking to compromise the confidentiality, integrity or…
Leveraging the Experimental Method to Inform Solar Cell Design
ERIC Educational Resources Information Center
Rose, Mary Annette; Ribblett, Jason W.; Hershberger, Heather Nicole
2010-01-01
In this article, the underlying logic of experimentation is exemplified within the context of a photoelectrical experiment for students taking a high school engineering, technology, or chemistry class. Students assume the role of photochemists as they plan, fabricate, and experiment with a solar cell made of copper and an aqueous solution of…
Design and fabrication of a flexible substrate microelectrode array for brain machine interfaces.
Patrick, Erin; Ordonez, Matthew; Alba, Nicolas; Sanchez, Justin C; Nishida, Toshikazu
2006-01-01
We report a neural microelectrode array design that leverages the recording properties of conventional microwire electrode arrays with the additional features of precise control of the electrode geometries. Using microfabrication techniques, a neural probe array is fabricated that possesses a flexible polyimide-based cable. The performance of the design was tested with electrochemical impedance spectroscopy and in vivo studies. The gold-plated electrode site has an impedance value of 0.9 MΩ at 1 kHz. Acute neural recording provided high neuronal yields, peak-to-peak amplitudes (as high as 100 µV), and signal-to-noise ratios (27 dB).
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
2012-10-01
library as a principal Requestor. The M3CT requestor is written in Java, leveraging the cross-platform deployment capabilities needed for a broadly...each application to the Java programming language, the independently generated sources are wrapped with JNA or Groovy. The Java wrapping process... (Figure 13: Leveraging Languages) Once the underlying product is available to the Java source as a library, the application leverages
ERIC Educational Resources Information Center
Gaudet, Cyndi; Annulis, Heather; Kmiec, John
2010-01-01
The Geospatial Technology Apprenticeship Program (GTAP) pilot was designed as a replicable and sustainable program to enhance workforce skills in geospatial technologies to best leverage a $30 billion market potential. The purpose of evaluating GTAP was to ensure that investment in this high-growth industry was adding value. Findings from this…
Using JWST Heritage to Enable a Future Large Ultra-Violet Optical Infrared Telescope
NASA Technical Reports Server (NTRS)
Feinberg, Lee
2016-01-01
To the extent it makes sense, leverage JWST knowledge, designs, architectures, GSE. Develop a scalable design reference mission (9.2 meter). Do just enough work to understand launch break points in aperture size. Demonstrate 10 pm stability is achievable on a design reference mission. Make design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so you can evolve the instruments. Keep it room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.
ERIC Educational Resources Information Center
Prescott, Alexandra S.; Luippold-Roge, Genevieve P.; Gurman, Tilly A.
2016-01-01
Objective: Maya women in Guatemala are disproportionately affected by poverty and negative reproductive health outcomes. Although social networks are valued in many Indigenous cultures, few studies have explored whether health education programmes can leverage these networks to improve reproductive health and economic wellbeing. Design: This…
Leveraging the Technology du Jour for Overt and Covert Faculty Development
ERIC Educational Resources Information Center
Hagler, Debra; Kastenbaum, Beatrice; Brooks, Ruth; Morris, Brenda; Saewert, Karen J.
2013-01-01
Leveraging Educational Technology for Evidence-Based Practice (LET-EBP), a four year federally funded project, was designed to extend use of educational technologies in the prelicensure undergraduate nursing program of a large public research university. Faculty members supported through the project developed and integrated over 20…
Cryogenic ultra-high power infrared diode laser bars
NASA Astrophysics Data System (ADS)
Crump, Paul; Frevert, C.; Hösler, H.; Bugge, F.; Knigge, S.; Pittroff, W.; Erbert, G.; Tränkle, G.
2014-02-01
GaAs-based high power diode lasers are the most efficient source of optical energy, and are in wide use in industrial applications, either directly or as pump sources for other laser media. Increased output power per laser is required to enable new applications (increased optical power density) and to reduce cost (more output per component leads to lower cost in $/W). For example, laser bars in the 9xx nm wavelength range with the very highest power and efficiency are needed as pump sources for many high-energy-class solid-state laser systems. We present here the latest performance progress using a novel design approach that leverages operation at temperatures below 0°C for increases in bar power and efficiency. We show experimentally that operation at -55°C increases conversion efficiency and suppresses thermal rollover, enabling peak quasi-continuous wave bar powers of P_out > 1.6 kW to be achieved (1.2 ms, 10 Hz), limited by the available current. The conversion efficiency at 1.6 kW is 53%. Following on from this demonstration work, the key open challenge is to develop designs that deliver higher efficiencies, targeting > 80% at 1.6 kW. We present an analysis of the limiting factors and show that low electrical resistance is crucial, meaning that long resonators and high fill factor are needed. We also review progress in epitaxial design developments that leverage low temperatures to enable both low resistance and high optical performance. Latest results will be presented, summarizing the impact on bar performance, and options for further improvements to efficiency will also be reviewed.
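A quick energy-balance check on the reported operating point, assuming conversion efficiency is optical output over electrical input:

\[
P_{\mathrm{el}} = \frac{P_{\mathrm{out}}}{\eta} = \frac{1.6~\mathrm{kW}}{0.53} \approx 3.0~\mathrm{kW},
\qquad
P_{\mathrm{heat}} = P_{\mathrm{el}} - P_{\mathrm{out}} \approx 1.4~\mathrm{kW},
\]

so reaching the stated 80% target at the same 1.6 kW output would cut the waste heat per bar to roughly 0.4 kW.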
Democratizing Authority in the Built Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersen, Michael P; Kolb, John; Chen, Kaifei
Operating systems and applications in the built environment have relied upon central authorization and management mechanisms which restrict their scalability, especially with respect to administrative overhead. We propose a new set of primitives encompassing syndication, security, and service execution that unifies the management of applications and services across the built environment, while enabling participants to individually delegate privilege across multiple administrative domains with no loss of security or manageability. We show how to leverage a decentralized authorization syndication platform to extend the design of building operating systems beyond the single administrative domain of a building. The authorization system leveraged is based on blockchain smart contracts to permit decentralized and democratized delegation of authorization without central trust. Upon this, a publish/subscribe syndication tier and a containerized service execution environment are constructed. Combined, these mechanisms solve problems of delegation, federation, device protection and service execution that arise throughout the built environment. We leverage a high-fidelity city-scale emulation to verify the scalability of the authorization tier, and briefly describe a prototypical democratized operating system for the built environment using this foundation.
Lamberti, J Steven; Russ, Ann; Cerulli, Catherine; Weisman, Robert L; Jacobowitz, David; Williams, Geoffrey C
2014-01-01
Legal leverage is broadly defined as the use of legal authority to promote treatment adherence. It is widely utilized within mental health courts, drug courts, mandated outpatient treatment programs, and other intervention strategies for individuals with mental illness or chemical dependency who have contact with the criminal justice system. Nonetheless, the ethics of using legal authority to promote treatment adherence remains a hotly debated issue within public and professional circles alike. While critics characterize legal leverage as a coercive form of social control that undermines personal autonomy, advocates contend that it supports autonomy because treatment strategies using legal leverage are designed to promote health and independence. Despite the controversy, there is little evidence regarding the impact of legal leverage on patient autonomy as experienced and expressed by patients themselves. This report presents findings from a qualitative study involving six focus groups with severely mentally ill outpatients who received legal leverage through three forensic assertive community treatment (FACT) programs in Northeastern, Midwestern, and West Coast cities. Findings are discussed in the context of the self-determination theory of human motivation, and practical implications for the use of legal leverage are considered.
NASA Astrophysics Data System (ADS)
Ariffin, Syaiba Balqish; Midi, Habshah
2014-06-01
This article is concerned with the performance of the logistic ridge regression estimation technique in the presence of multicollinearity and high leverage points. In logistic regression, multicollinearity exists among predictors and in the information matrix. The maximum likelihood estimator suffers a huge setback in the presence of multicollinearity, which causes regression estimates to have unduly large standard errors. To remedy this problem, a logistic ridge regression estimator is put forward. It is evident that the logistic ridge regression estimator outperforms the maximum likelihood approach for handling multicollinearity. The effect of high leverage points on the performance of the logistic ridge regression estimator is then investigated through a real data set and a simulation study. The findings signify that the logistic ridge regression estimator fails to provide better parameter estimates in the presence of both high leverage points and multicollinearity.
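The setup described, collinear predictors plus an extreme observation, is easy to reproduce in a short simulation. The sketch below uses scikit-learn's L2-penalized logistic regression as the ridge-type estimator and a near-zero penalty as a stand-in for maximum likelihood; the sample size and penalty strengths are illustrative only, not the paper's settings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 200

# Two highly collinear predictors.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # near-duplicate of x1
X = np.column_stack([x1, x2])

# True model depends only on x1.
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x1)))
y = rng.binomial(1, p)

# Inject a single high-leverage observation (extreme in predictor space).
X = np.vstack([X, [8.0, -8.0]])
y = np.append(y, 0)

# "Maximum likelihood" (ridge penalty made negligible) vs. ridge-penalized fit.
mle_like = LogisticRegression(penalty="l2", C=1e8, max_iter=5000).fit(X, y)
ridge = LogisticRegression(penalty="l2", C=1.0, max_iter=5000).fit(X, y)

print("near-MLE coefficients:", mle_like.coef_.ravel())
print("ridge    coefficients:", ridge.coef_.ravel())
```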
GATE: Energy Efficient Vehicles for Sustainable Mobility-Project TI022- FinalReport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzoni, Giorgio
A unique opportunity for industry to engage in original, highly leveraged precompetitive research in automotive and transportation systems, with a focus on advanced propulsion systems; fuel economy; vehicle safety, connectivity, and autonomy; and advanced driver assistance systems. Additional benefits include preparing graduate students for future careers in the automotive industry, reaching undergraduate students through capstone design and other project activities, and focused recruitment events.
Identification of novel IP receptor agonists using historical ligand biased chemical arrays.
McKeown, Stephen C; Charlton, Steven J; Cox, Brian; Fitch, Helen; Howson, Christopher D; Leblanc, Catherine; Meyer, Arndt; Rosethorne, Elizabeth M; Stanley, Emily
2014-05-15
By considering published structural information we have designed high throughput biaryl lipophilic acid arrays leveraging facile chemistry to expedite their synthesis. We rapidly identified multiple hits which were of suitable IP agonist potency. These relatively simple and strategically undecorated molecules present an ideal opportunity for optimization towards our target candidate profile. Copyright © 2014 Elsevier Ltd. All rights reserved.
Leveraging PBL and Game to Redesign an Introductory Course
ERIC Educational Resources Information Center
Warren, Scott J.; Dondlinger, Mary Jo; Jones, Greg; Whitworth, Cliff
2010-01-01
The purpose of this paper is to discuss one instructional design that leverages problem-based learning and game structures as a means of developing innovative higher education courses for students as responsive, lived experiences. This paper reviews a curricular redesign that stemmed from the evaluation of an introductory course in computer…
2014-03-27
fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned with a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the
Optical isolation based on space-time engineered asymmetric photonic band gaps
NASA Astrophysics Data System (ADS)
Chamanara, Nima; Taravati, Sajjad; Deck-Léger, Zoé-Lise; Caloz, Christophe
2017-10-01
Nonreciprocal electromagnetic devices play a crucial role in modern microwave and optical technologies. Conventional methods for realizing such systems are incompatible with integrated circuits. With recent advances in integrated photonics, the need for efficient on-chip magnetless nonreciprocal devices has become more pressing than ever. This paper leverages space-time engineered asymmetric photonic band gaps to generate optical isolation. It shows that a properly designed space-time modulated slab is highly reflective/transparent for opposite directions of propagation. The corresponding design is magnetless, accommodates low modulation frequencies, and can achieve very high isolation levels. An experimental proof of concept at microwave frequencies is provided.
Revere, Debra; Dixon, Brian E; Hills, Rebecca; Williams, Jennifer L; Grannis, Shaun J
2014-01-01
Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public's health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health.
NASA Technical Reports Server (NTRS)
Rector, Tony; Steele, John W.; Bue, Grant C.; Campbell, Colin; Makinen, Janice
2012-01-01
A water loop maintenance device and process to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop has been undergoing a performance evaluation. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing Sublimator technology. The driver for the water recirculation maintenance device and process is to further enhance this advantage through the leveraging of fluid loop management lessons-learned from the International Space Station (ISS). A bed design that was developed for a Hamilton Sundstrand military application, and considered for a potential ISS application with the Urine Processor Assembly, provides a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The maintenance process further leverages a sorbent developed for ISS that introduces a biocide in a microgravity-compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware. This
Leveraging knowledge engineering and machine learning for microbial bio-manufacturing.
Oyetunde, Tolutola; Bao, Forrest Sheng; Chen, Jiung-Wen; Martin, Hector Garcia; Tang, Yinjie J
2018-05-03
Genome scale modeling (GSM) predicts the performance of microbial workhorses and helps identify beneficial gene targets. GSM integrated with intracellular flux dynamics, omics, and thermodynamics has shown remarkable progress in both elucidating complex cellular phenomena and computational strain design (CSD). Nonetheless, these models still show high uncertainty due to a poor understanding of innate pathway regulations, metabolic burdens, and other factors (such as stress tolerance and metabolite channeling). Besides, the engineered hosts may have genetic mutations or non-genetic variations in bioreactor conditions, and thus CSD rarely foresees fermentation rate and titer. Metabolic models play an important role in design-build-test-learn cycles for strain improvement, and machine learning (ML) may provide a viable complementary approach for driving strain design and deciphering cellular processes. In order to develop quality ML models, knowledge engineering leverages and standardizes the wealth of information in the literature (e.g., genomic/phenomic data, synthetic biology strategies, and bioprocess variables). Data-driven frameworks can offer new constraints for mechanistic models to describe cellular regulations, to design pathways, to search gene targets, and to estimate fermentation titer/rate/yield under specified growth conditions (e.g., mixing, nutrients, and O2). This review highlights the scope of information collections, database constructions, and machine learning techniques (such as deep learning and transfer learning), which may facilitate "Learn and Design" for strain development. Copyright © 2018. Published by Elsevier Inc.
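As a rough illustration of the "Learn and Design" loop described above, one might fit a simple regressor on curated strain/bioprocess records and use it to rank candidate designs. Everything below, the feature names, the response surface, and the data, is invented for illustration and is not drawn from the review.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

# Hypothetical curated records: [copy number, promoter strength, O2 level,
# feed rate] -> observed titer (g/L). A real pipeline would draw these from
# standardized literature/omics databases.
X = rng.uniform([1, 0.1, 5, 0.1], [10, 1.0, 40, 2.0], size=(300, 4))
titer = (0.8 * X[:, 1] * X[:, 0]            # expression helps...
         - 0.02 * (X[:, 0] - 5) ** 2        # ...until metabolic burden kicks in
         + 0.05 * X[:, 2] + 0.5 * X[:, 3]
         + rng.normal(scale=0.3, size=300))

# "Learn": fit a data-driven model of titer from strain/process features.
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, titer)

# "Design": score a grid of candidate strains/conditions and pick the best.
candidates = rng.uniform([1, 0.1, 5, 0.1], [10, 1.0, 40, 2.0], size=(1000, 4))
best = candidates[np.argmax(model.predict(candidates))]
print("suggested design [copies, promoter, O2, feed]:", np.round(best, 2))
```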
Leveraging advances in biology to design biomaterials
NASA Astrophysics Data System (ADS)
Darnell, Max; Mooney, David J.
2017-12-01
Biomaterials have dramatically increased in functionality and complexity, allowing unprecedented control over the cells that interact with them. From these engineering advances arises the prospect of improved biomaterial-based therapies, yet practical constraints favour simplicity. Tools from the biology community are enabling high-resolution and high-throughput bioassays that, if incorporated into a biomaterial design framework, could help achieve unprecedented functionality while minimizing the complexity of designs by identifying the most important material parameters and biological outputs. However, to avoid data explosions and to effectively match the information content of an assay with the goal of the experiment, material screens and bioassays must be arranged in specific ways. By borrowing methods to design experiments and workflows from the bioprocess engineering community, we outline a framework for the incorporation of next-generation bioassays into biomaterials design to effectively optimize function while minimizing complexity. This framework can inspire biomaterials designs that maximize functionality and translatability.
NASA Astrophysics Data System (ADS)
Xin, Chen; Huang, Ji-Ping
2017-12-01
Agent-based modeling and controlled human experiments serve as two fundamental research methods in the field of econophysics. Agent-based modeling has been in development for over 20 years, but how to design virtual agents with high levels of human-like "intelligence" remains a challenge. On the other hand, experimental econophysics is an emerging field; however, there is a lack of experience and paradigms related to the field. Here, we review some of the most recent research results obtained through the use of these two methods concerning financial problems such as chaos, leverage, and business cycles. We also review the principles behind assessments of agents' intelligence levels and some relevant designs for human experiments. The main theme of this review is to show that by combining theory, agent-based modeling, and controlled human experiments, one can garner more reliable and credible results thanks to better verification of theory; in this way, a wider range of economic and financial problems and phenomena can be studied.
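A deliberately minimal agent-based sketch can show why leverage is a natural target for such models: when agents' order sizes scale with an allowed leverage factor, price fluctuations amplify. The toy below is not any specific model from the reviewed literature; its price-impact rule and parameters are arbitrary.

```python
import numpy as np

def simulate(leverage, n_agents=100, n_steps=500, seed=0):
    """Toy market: each step, agents buy or sell at random; order size scales
    with wealth times the allowed leverage, and excess demand moves the log
    price. Returns the realized volatility of returns."""
    rng = np.random.default_rng(seed)
    wealth = np.ones(n_agents)
    returns = []
    for _ in range(n_steps):
        direction = rng.choice([-1.0, 1.0], size=n_agents)      # buy or sell
        orders = direction * wealth * leverage * rng.random(n_agents)
        excess_demand = orders.sum() / n_agents
        r = 0.01 * excess_demand                                  # price impact
        wealth *= 1.0 + direction * r                             # mark to market
        wealth = np.clip(wealth, 0.0, None)                       # default at zero
        returns.append(r)
    return np.std(returns)

for lev in (1, 2, 5, 10):
    print(f"leverage {lev:2d}: return volatility ~ {simulate(lev):.4f}")
```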
ERIC Educational Resources Information Center
Fairchild, Susan; Carrino, Gerard; Gunton, Brad; Soderquist, Chris; Hsiao, Andrew; Donohue, Beverly; Farrell, Timothy
2012-01-01
New Visions for Public Schools has leveraged student-level data to help schools identify at-risk students, designed metrics to capture student progress toward graduation, developed data tools and reports that visualize student progress at different levels of aggregation for different audiences, and implemented real-time data systems for educators.…
Code of Federal Regulations, 2010 CFR
2010-04-01
... when and as requested by any authorized representative of the Commission, designated self-regulatory... Commission, designated self-regulatory organization or the U.S. Department of Justice. (d) Each leverage...
A High-Leverage Language Teaching Practice: Leading an Open-Ended Group Discussion
ERIC Educational Resources Information Center
Kearney, Erin
2015-01-01
In response to calls for more practice-based teacher education, this study investigated the way in which two high-performing novice world language teachers, one in Spanish and one in Latin, implemented a high-leverage teaching practice, leading an open-ended group discussion. Observational data revealed a number of constituent micro-practices. The…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... Change Relating to Investments in Leveraged Loans by the Peritus High Yield ETF August 29, 2013. Pursuant... the holdings of the Peritus High Yield ETF to achieve its investment objective to include leveraged loans. Peritus High Yield ETF is currently listed and traded on the Exchange under NYSE Arca Equities...
Oscillator circuit for use with high loss quartz resonator sensors
Wessendorf, Otto
1995-01-01
The disclosure is directed to a Lever oscillator for use in high resistance resonator applications, especially for use with quartz resonator sensors. The oscillator is designed to operate over a wide dynamic range of resonator resistance due to damping of the resonator in mediums such as liquids. An oscillator design is presented that allows both frequency and loss (R_m) of the resonator to be determined over a wide dynamic range of resonator loss. The Lever oscillator uses negative feedback in a differential amplifier configuration to actively and variably divide (or leverage) the resonator impedance such that the oscillator can maintain the phase and gain of the loop over a wide range of resonator resistance.
Hospitals Negotiating Leverage with Health Plans: How and Why Has It Changed?
Devers, Kelly J; Casalino, Lawrence P; Rudell, Liza S; Stoddard, Jeffrey J; Brewster, Linda R; Lake, Timothy K
2003-01-01
Objective To describe how hospitals' negotiating leverage with managed care plans changed from 1996 to 2001 and to identify factors that explain any changes. Data Sources Primary semistructured interviews, and secondary qualitative (e.g., newspaper articles) and quantitative (i.e., InterStudy, American Hospital Association) data. Study Design The Community Tracking Study site visits to a nationally representative sample of 12 communities with more than 200,000 people. These 12 markets have been studied since 1996 using a variety of primary and secondary data sources. Data Collection Methods Semistructured interviews were conducted with a purposive sample of individuals from hospitals, health plans, and knowledgeable market observers. Secondary quantitative data on the 12 markets was also obtained. Principal Findings Our findings suggest that many hospitals' negotiating leverage significantly increased after years of decline. Today, many hospitals are viewed as having the greatest leverage in local markets. Changes in three areas—the policy and purchasing context, managed care plan market, and hospital market—appear to explain why hospitals' leverage increased, particularly over the last two years (2000–2001). Conclusions Hospitals' increased negotiating leverage contributed to higher payment rates, which in turn are likely to increase managed care plan premiums. This trend raises challenging issues for policymakers, purchasers, plans, and consumers. PMID:12650374
An NLRA Transducer for Dual Use Bone Conduction Audio and Haptic Communication. Summary Report
2016-12-30
VIBRANT COMPOSITES INC. 1 A16-019 Phase 1 Summary Report Vibrant Composites Inc. December 30, 2016 I. ABSTRACT A combined transducer capable of bone ...transducer core capable of both precise haptic communication and high fidelity bone conduction audio. The transducer design leverages Micro-Multilayer...head-mounted system. In this Phase I SBIR, Vibrant Composites has delivered functional dual-mode bone conduction and vibrotactile transducer prototypes
2012-04-01
tactical electronic and optical reconnaissance (both high and low altitude); and 3) electronic combat (jamming and chaff dispensing).7 In contrast, the...sites or other radar sites. IAI designed the Harpy as a loitering UAS that would sit over the battlefield and search for electronic emissions from...tactical reconnaissance, and can be modified to carry different payloads for electronic warfare or attack missions. The Hermes 450 is the smallest
Business strategy and financial structure: an empirical analysis of acute care hospitals.
Ginn, G O; Young, G J; Beekun, R I
1995-01-01
This study investigated the relationship between business strategy and financial structure in the U.S. hospital industry. We studied two dimensions of financial structure--liquidity and leverage. Liquidity was assessed by the acid ratio, and leverage was assessed using the equity funding ratio. Drawing from managerial, finance, and resource dependence perspectives, we developed and tested hypotheses about the relationship between Miles and Snow strategy types and financial structure. Relevant contextual financial and organizational variables were controlled for statistically through the Multivariate Analysis of Covariance technique. The relationship between business strategy and financial structure was found to be significant. Among the Miles and Snow strategy types, defenders were found to have relatively high liquidity and low leverage. Prospectors typically had low liquidity and high leverage. Implications for financial planning, competitive assessment, and reimbursement policy are discussed.
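The two financial-structure measures named above are simple ratios. The sketch below shows how they might be computed; the exact formulas used in the study are not stated in the abstract, so the definitions here (acid ratio excluding inventory, equity funding as equity over total assets) are common textbook forms offered only as assumptions.

```python
# Minimal sketch of the two financial-structure measures; definitions are assumptions,
# not necessarily the ones used in the study.
def acid_ratio(current_assets, inventory, current_liabilities):
    """Liquidity proxy: liquid assets available per dollar of short-term obligations."""
    return (current_assets - inventory) / current_liabilities

def equity_funding_ratio(total_equity, total_assets):
    """Leverage proxy: share of assets financed by equity; higher values mean lower leverage."""
    return total_equity / total_assets

# Illustrative hospital with hypothetical figures (USD millions)
print(acid_ratio(current_assets=120.0, inventory=15.0, current_liabilities=80.0))  # ~1.31
print(equity_funding_ratio(total_equity=300.0, total_assets=500.0))                # 0.60
```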
Design of a power-asymmetric actuator for a transtibial prosthesis.
Bartlett, Harrison L; Lawson, Brian E; Goldfarb, Michael
2017-07-01
This paper presents the design and characterization of a power-asymmetric actuator for a transtibial prosthesis. The device is designed to provide the combination of: 1) joint locking, 2) high power dissipation, and 3) low power generation. This actuator functionality allows for a prosthesis to be designed with minimal mass and power consumption relative to a fully-powered robotic prosthesis while maintaining much of the functionality necessary for activities of daily living. The actuator achieves these design characteristics while maintaining a small form factor by leveraging a combination of electromechanical and hydraulic components. The design of the actuator is described herein, and results of an experimental characterization are provided that indicate that the actuator is capable of providing the functional capabilities required of an ankle prosthesis in a compact and lightweight package.
Rapidly converging multigrid reconstruction of cone-beam tomographic data
NASA Astrophysics Data System (ADS)
Myers, Glenn R.; Kingston, Andrew M.; Latham, Shane J.; Recur, Benoit; Li, Thomas; Turner, Michael L.; Beeching, Levi; Sheppard, Adrian P.
2016-10-01
In the context of large-angle cone-beam tomography (CBCT), we present a practical iterative reconstruction (IR) scheme designed for rapid convergence as required for large datasets. The robustness of the reconstruction is provided by the "space-filling" source trajectory along which the experimental data is collected. The speed of convergence is achieved by leveraging the highly isotropic nature of this trajectory to design an approximate deconvolution filter that serves as a pre-conditioner in a multi-grid scheme. We demonstrate this IR scheme for CBCT and compare convergence to that of more traditional techniques.
Analysis of debt leveraging in private power projects. Revision
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, E.P.; Meal, M.; Doerrer, S.
1992-08-01
As private power (non-utility generation) has grown to become a significant part of the electricity system, increasing concern about its financial implications has arisen. In many cases, the source of this concern has been the substantial reliance of these projects on debt financing. This study examines debt leveraging in private power projects. The policy debate on these issues has typically been conducted at a high level of generality. Critics of the private power industry assert that high debt leveraging confers an unfair competitive advantage by lowering the cost of capital, and that this leveraging is only possible because risks are shifted to the utility. Further, debt leveraging is claimed to be a threat to reliability. On the opposite side, it is argued that debt leveraging imposes costs and obligations not borne by utilities, and so there is no financial advantage. The private producers also argue that on balance more risk is shifted away from utilities than to them, and that incentives for reliability are strong. In this study we examine the project finance mechanisms used in private power lending in detail, relying on a sample of actual loan documents. This review and its findings should be relevant to the further evolution of this debate. State regulatory commissions are likely to be interested in it, and Federal legislation to amend the Public Utility Holding Company Act (PUHCA) could require states to consider the implications of debt leveraging in relation to their oversight of utility power purchase programs.
Analysis of debt leveraging in private power projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, E.P.; Meal, M.; Doerrer, S.
1992-08-01
As private power has grown to become a significant part of the electricity system, increasing concern about its financial implications has arisen. In many cases, the source of this concern has been the substantial reliance of these projects on debt financing. This study examines debt leveraging in private power projects. The policy debate on these issues has typically been conducted at a high level of generality. Critics of the private power industry assert that high debt leveraging confers an unfair competitive advantage by lowering the cost of capital. This leveraging is only possible because risks are shifted to the utility. Further, debt leveraging is claimed to be a threat to reliability. On the opposite side, it is argued that debt leveraging imposes costs and obligations not borne by utilities, and so there is no financial advantage. The private producers also argue that on balance more risk is shifted away from utilities than to them, and that incentives for reliability are strong. In this study we examine the project finance mechanisms used in private power lending in detail, relying on a sample of actual loan documents. This review and its findings should be relevant to the further evolution of this debate. State regulatory commissions are likely to be interested in it, and Federal legislation to amend the Public Utility Holding Company Act (PUHCA) could require states to consider the implications of debt leveraging in relation to their oversight of utility power purchase programs.
Challenges in Securing the Interface Between the Cloud and Pervasive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagesse, Brent J
2011-01-01
Cloud computing presents an opportunity for pervasive systems to leverage computational and storage resources to accomplish tasks that would not normally be possible on such resource-constrained devices. Cloud computing can enable hardware designers to build lighter systems that last longer and are more mobile. Despite the advantages cloud computing offers to the designers of pervasive systems, there are some limitations of leveraging cloud computing that must be addressed. We take the position that cloud-based pervasive systems must be secured holistically and discuss ways this might be accomplished. In this paper, we discuss a pervasive system utilizing cloud computing resources and issues that must be addressed in such a system. In this system, the user's mobile device cannot always have network access to leverage resources from the cloud, so it must make intelligent decisions about what data should be stored locally and what processes should be run locally. As a result of these decisions, the user becomes vulnerable to attacks while interfacing with the pervasive system.
Revere, Debra; Dixon, Brian E.; Hills, Rebecca; Williams, Jennifer L.; Grannis, Shaun J.
2014-01-01
Introduction: Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public’s health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Background: Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Methods: Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. Findings: A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. Discussion: In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Conclusion: Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health. PMID:25848615
Qualls, Joseph; Russomanno, David J.
2011-01-01
The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081
Soldier-Warfighter Operationally Responsive Deployer for Space
NASA Technical Reports Server (NTRS)
Davis, Benny; Huebner, Larry; Kuhns, Richard
2015-01-01
The Soldier-Warfighter Operationally Responsive Deployer for Space (SWORDS) project was a joint project between the U.S. Army Space & Missile Defense Command (SMDC) and NASA. The effort, led by SMDC, was intended to develop a three-stage liquid bipropellant (liquid oxygen/liquid methane), pressure-fed launch vehicle capable of inserting a payload of at least 25 kg to a 750-km circular orbit. The vehicle design was driven by low cost instead of high performance. SWORDS leveraged commercial industry standards to utilize standard hardware and technologies over customized unique aerospace designs. SWORDS identified broadly based global industries that have achieved adequate levels of quality control and reliability in their products and then designed around their expertise and business motivations.
Robust Modal Filtering and Control of the X-56A Model with Simulated Fiber Optic Sensor Failures
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Chin, Alexander W.; Mavris, Dimitri N.
2014-01-01
The X-56A aircraft is a remotely-piloted aircraft with flutter modes intentionally designed into the flight envelope. The X-56A program must demonstrate flight control while suppressing all unstable modes. A previous X-56A model study demonstrated a distributed-sensing-based active shape and active flutter suppression controller. The controller relies on an estimator which is sensitive to bias. This estimator is improved herein, and a real-time robust estimator is derived and demonstrated on 1530 fiber optic sensors. It is shown in simulation that the estimator can simultaneously reject 230 worst-case fiber optic sensor failures automatically. These sensor failures include locations with high leverage (or importance). To reduce the impact of leverage outliers, concentration based on a Mahalanobis trim criterion is introduced. A redescending M-estimator with Tukey bisquare weights is used to improve location and dispersion estimates within each concentration step in the presence of asymmetry (or leverage). A dynamic simulation is used to compare the concentrated robust estimator to a state-of-the-art real-time robust multivariate estimator. The estimators support a previously-derived mu-optimal shape controller. It is found that during the failure scenario, the concentrated modal estimator keeps the system stable.
Robust Modal Filtering and Control of the X-56A Model with Simulated Fiber Optic Sensor Failures
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Chin, Alexander W.; Mavris, Dimitri N.
2016-01-01
The X-56A aircraft is a remotely-piloted aircraft with flutter modes intentionally designed into the flight envelope. The X-56A program must demonstrate flight control while suppressing all unstable modes. A previous X-56A model study demonstrated a distributed-sensing-based active shape and active flutter suppression controller. The controller relies on an estimator which is sensitive to bias. This estimator is improved herein, and a real-time robust estimator is derived and demonstrated on 1530 fiber optic sensors. It is shown in simulation that the estimator can simultaneously reject 230 worst-case fiber optic sensor failures automatically. These sensor failures include locations with high leverage (or importance). To reduce the impact of leverage outliers, concentration based on a Mahalanobis trim criterion is introduced. A redescending M-estimator with Tukey bisquare weights is used to improve location and dispersion estimates within each concentration step in the presence of asymmetry (or leverage). A dynamic simulation is used to compare the concentrated robust estimator to a state-of-the-art real-time robust multivariate estimator. The estimators support a previously-derived mu-optimal shape controller. It is found that during the failure scenario, the concentrated modal estimator keeps the system stable.
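The estimator described above combines two ingredients that can be sketched compactly: a Mahalanobis-distance trim (the "concentration" step) and a redescending Tukey bisquare reweighting of the surviving points. The following toy sketch illustrates that loop on synthetic data; it is not the X-56A flight code, and the tuning constants, trim fraction, and data shapes are assumptions.

```python
import numpy as np

def tukey_bisquare_weights(d, c=4.685):
    """Redescending Tukey bisquare weights: w = (1 - (d/c)^2)^2 for d <= c, else 0."""
    w = np.zeros_like(d)
    inside = np.abs(d) <= c
    w[inside] = (1.0 - (d[inside] / c) ** 2) ** 2
    return w

def concentrated_robust_estimate(X, trim_frac=0.25, n_steps=5, c=4.685):
    """Toy concentration loop: trim the largest Mahalanobis distances, then
    re-estimate location/dispersion with bisquare weights on the survivors."""
    mu = np.median(X, axis=0)
    cov = np.cov(X, rowvar=False) + 1e-9 * np.eye(X.shape[1])
    for _ in range(n_steps):
        diff = X - mu
        d2 = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
        d = np.sqrt(np.maximum(d2, 0.0))
        keep = d <= np.quantile(d, 1.0 - trim_frac)           # Mahalanobis trim criterion
        w = tukey_bisquare_weights(d[keep] / np.median(d[keep]), c=c) + 1e-12
        Xk = X[keep]
        mu = np.average(Xk, axis=0, weights=w)
        diffk = Xk - mu
        cov = (diffk * w[:, None]).T @ diffk / w.sum() + 1e-9 * np.eye(X.shape[1])
    return mu, cov

# Synthetic snapshot: 200 samples of 5 channels, with a block of outliers
# standing in for failed fiber optic sensors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:10] += 8.0
mu, cov = concentrated_robust_estimate(X)
print(mu)   # location estimate stays near zero despite the simulated failures
```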
Direct medical cost and utility analysis of diabetics outpatient at Karanganyar public hospital
NASA Astrophysics Data System (ADS)
Eristina; Andayani, T. M.; Oetari, R. A.
2017-11-01
Diabetes mellitus is a high-cost disease, especially when long-term complications must be treated. The cost of long-term complication treatment is a burden for patients and can affect their quality of life, expressed as a utility value. The purpose of this study was to determine the direct medical cost, utility value, and leverage factors for diabetic outpatients. The study used a cross-sectional design; direct medical costs were collected retrospectively from medical records of the financial and pharmacy departments, and utility values were obtained from the EQ-5D-5L questionnaire. Data were analyzed with Mann-Whitney and Kruskal-Wallis tests. The direct medical cost was IDR 433,728.00, with pharmacy as the largest cost component. On the EQ-5D-5L, the largest proportions in each dimension were 61% reporting no problems with mobility, 89% no problems with self-care, 54% slight problems with usual activities, 41% moderate problems with pain/discomfort, and 48% moderate problems with anxiety/depression. Based on the Thailand value set, the utility value was 0.833. Leverage factors for the direct medical cost were therapy pattern, blood glucose level, and complications; leverage factors for the utility value were patient characteristics, therapy pattern, blood glucose level, and complications.
Techniques for Conducting Effective Concept Design and Design-to-Cost Trade Studies
NASA Technical Reports Server (NTRS)
Di Pietro, David A.
2015-01-01
Concept design plays a central role in project success as its product effectively locks the majority of system life cycle cost. Such extraordinary leverage presents a business case for conducting concept design in a credible fashion, particularly for first-of-a-kind systems that advance the state of the art and that have high design uncertainty. A key challenge, however, is to know when credible design convergence has been achieved in such systems. Using a space system example, this paper characterizes the level of convergence needed for concept design in the context of technical and programmatic resource margins available in preliminary design and highlights the importance of design and cost evaluation learning curves in determining credible convergence. It also provides techniques for selecting trade study cases that promote objective concept evaluation, help reveal unknowns, and expedite convergence within the trade space and conveys general practices for conducting effective concept design-to-cost studies.
News from CEC: High-Leverage Practices in Special Education
ERIC Educational Resources Information Center
TEACHING Exceptional Children, 2017
2017-01-01
In fall 2014, the Council for Exceptional Children's (CEC) Board of Directors approved a proposal from the Professional Standards and Practice Committee (PSPC) to develop a set of high-leverage practices (HLPs) for special education teachers. The CEEDAR Center at the University of Florida, which is funded by the U.S. Department of Education's…
Design Review and Analysis | Water Power | NREL
NREL is leveraging its 35 years of experience in the design review and analysis of water power devices and components. As part of this effort, NREL researchers provide industry partners with design reviews and analyses. In addition to design reviews, NREL offers technical assistance to solve specific…
High-Penetration PV Integration Handbook for Distribution Engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seguin, Rich; Woyak, Jeremy; Costyk, David
2016-01-01
This handbook has been developed as part of a five-year research project which began in 2010. The National Renewable Energy Laboratory (NREL), Southern California Edison (SCE), Quanta Technology, Satcon Technology Corporation, Electrical Distribution Design (EDD), and Clean Power Research (CPR) teamed together to analyze the impacts of high-penetration levels of photovoltaic (PV) systems interconnected onto the SCE distribution system. This project was designed specifically to leverage the experience that SCE and the project team would gain during the significant installation of 500 MW of commercial scale PV systems (1-5 MW typically) starting in 2010 and completing in 2015 within SCE’s service territory through a program approved by the California Public Utility Commission (CPUC).
17 CFR 31.13 - Financial reports of leverage transaction merchants.
Code of Federal Regulations, 2010 CFR
2010-04-01
... designated self-regulatory organization and conforms to minimum financial standards and related reporting requirements set by such designated self-regulatory organization in its bylaws, rules, regulations, or... true and exact copy of each financial report which it files with such designated self-regulatory...
ERIC Educational Resources Information Center
Meuwissen, Kevin W.; Thomas, Andrew L.
2016-01-01
The notion that teacher education should emphasize high-leverage practice, which is research based, represents the complexity of the subject matter, bolsters teachers' understanding of student learning, is adaptable to different curricular circumstances, and can be mastered with regular use, has traction in scholarship. Nevertheless, how teacher…
Design and Development of a Methane Cryogenic Propulsion Stage for Human Mars Exploration
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Polsgrove, Tara; Turpin, Jason; Alexander, Leslie
2016-01-01
NASA is currently working on the Evolvable Mars Campaign (EMC) study to outline transportation and mission options for human exploration of Mars. One of the key aspects of the EMC is leveraging current and planned near-term technology investments to build an affordable and evolvable approach to Mars exploration. This leveraging of investments includes the use of high-power Solar Electric Propulsion (SEP) systems, evolved from those currently under development in support of the Asteroid Redirect Mission (ARM), to deliver payloads to Mars. The EMC is considering several transportation options that combine solar electric and chemical propulsion technologies to deliver crew and cargo to Mars. In one primary architecture option, the SEP propulsion system is used to pre-deploy mission elements to Mars while a high-thrust chemical propulsion system is used to send crew on faster ballistic transfers between Earth and Mars. This high-thrust chemical system uses liquid oxygen - liquid methane main propulsion and reaction control systems integrated into the Methane Cryogenic Propulsion Stage (MCPS). Over the past year, there have been several studies completed to provide critical design and development information related to the MCPS. This paper is intended to provide a summary of these efforts. A summary of the current point of departure design for the MCPS is provided as well as an overview of the mission architecture and concept of operations that the MCPS is intended to support. To leverage the capabilities of solar electric propulsion to the greatest extent possible, the EMC architecture pre-deploys to Mars orbit the stages required for returning crew from Mars. While this changes the risk posture of the architecture, it can provide some mass savings by using higher-efficiency systems for interplanetary transfer. However, this does introduce significantly longer flight times to Mars which, in turn, increases the overall lifetime of the stages to as long as 2500 days. This unique aspect to the concept of operations introduces several challenges, specifically related to propellant storage and engine reliability. These challenges and some potential solutions are discussed. Specific focus is provided on two key technology areas: propulsion and cryogenic fluid management. In the area of propulsion development, the development of an integrated methane propulsion system that combines both main propulsion and reaction control is discussed. This includes an overview of potential development paths, areas where development for Mars applications is complementary to development efforts underway in other parts of the aerospace industry, and commonality between the MCPS methane propulsion applications and other Mars elements, including the Mars lander systems. This commonality is a key affordability aspect of the Evolvable Mars Campaign. A similar discussion is provided for cryogenic fluid management technologies including a discussion of how using cryo propulsion in the Mars transportation application not only provides performance benefits but also leverages decades of technology development investments made by NASA and its aerospace contractor community.
Design and Development of a Methane Cryogenic Propulsion Stage for Human Mars Exploration
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Polsgrove, Tara; Turpin, Jason; Alexander, Leslie
2016-01-01
NASA is currently working on the Evolvable Mars Campaign (EMC) study to outline transportation and mission options for human exploration of Mars. One of the key aspects of the EMC is leveraging current and planned near-term technology investments to build an affordable and evolvable approach to Mars exploration. This leveraging of investments includes the use of high-power Solar Electric Propulsion (SEP) systems evolved from those currently under development in support of the Asteroid Redirect Mission to deliver payloads to Mars. The EMC is considering several transportation options that combine solar electric and chemical propulsion technologies to deliver crew and cargo to Mars. In one primary architecture option, the SEP propulsion system is used to pre-deploy mission elements to Mars while a high-thrust chemical propulsion system is used to send crew on faster ballistic transfers between Earth and Mars. This high-thrust chemical system uses liquid oxygen - liquid methane main propulsion and reaction control systems integrated into the Methane Cryogenic Propulsion Stage (MCPS). Over the past year, there have been several studies completed to provide critical design and development information related to the MCPS. This paper is intended to provide a summary of these efforts. A summary of the current point of departure design for the MCPS is provided as well as an overview of the mission architecture and concept of operations that the MCPS is intended to support. To leverage the capabilities of solar electric propulsion to the greatest extent possible, the EMC architecture pre-deploys the required stages for returning crew from Mars. While this changes the risk posture of the architecture, it provides mass savings by using higher-efficiency systems for interplanetary transfer. However, this does introduce significantly longer flight times to Mars which, in turn, increases the overall lifetime of the stages to as long as 3000 days. This unique aspect to the concept of operations introduces several challenges, specifically related to propellant storage and engine reliability. These challenges and some potential solutions are discussed. Specific focus is provided on two key technology areas: propulsion and cryogenic fluid management. In the area of propulsion development, the development of an integrated methane propulsion system that combines both main propulsion and reaction control is discussed. This includes an overview of potential development paths, areas where development for Mars applications is complementary to development efforts underway in other parts of the aerospace industry, and commonality between the MCPS methane propulsion applications and other Mars elements, including the Mars lander systems. This commonality is a key affordability aspect of the Evolvable Mars Campaign. A similar discussion is provided for cryogenic fluid management technologies including a discussion of how using cryo-propulsion in the Mars transportation application not only provides performance benefits but also leverages decades of technology development investments made by NASA and its aerospace contractor community.
Structure-from-motion for MAV image sequence analysis with photogrammetric applications
NASA Astrophysics Data System (ADS)
Schönberger, J. L.; Fraundorfer, F.; Frahm, J.-M.
2014-08-01
MAV systems have found increased attention in the photogrammetric community as an (autonomous) image acquisition platform for accurate 3D reconstruction. For an accurate reconstruction in feasible time, the acquired imagery requires specialized SfM software. Current systems typically use high-resolution sensors in pre-planned flight missions from far distance. We describe and evaluate a new SfM pipeline specifically designed for sequential, close-distance, and low-resolution imagery from mobile cameras with relatively high frame-rate and high overlap. Experiments demonstrate reduced computational complexity by leveraging the temporal consistency, comparable accuracy and point density with respect to state-of-the-art systems.
Leveraging Failure in Design Research
ERIC Educational Resources Information Center
Lobato, Joanne; Walters, C. David; Hohensee, Charles; Gruver, John; Diamond, Jaime Marie
2015-01-01
Even in the resource-rich, more ideal conditions of many design-based classroom interventions, unexpected events can lead to disappointing results in student learning. However, if later iterations in a design research study are more successful, the previous failures can provide opportunities for comparisons to reveal subtle differences in…
Formal Learning Sequences and Progression in the Studio: A Framework for Digital Design Education
ERIC Educational Resources Information Center
Wärnestål, Pontus
2016-01-01
This paper examines how to leverage the design studio learning environment throughout long-term Digital Design education in order to support students to progress from tactical, well-defined, device-centric routine design, to confidently design sustainable solutions for strategic, complex, problems for a wide range of devices and platforms in the…
Virtual reality 3D headset based on DMD light modulators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernacki, Bruce E.; Evans, Allan; Tang, Edward
We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micro-mirror devices (DMD). Our approach leverages silicon micro mirrors offering 720p resolution displays in a small form-factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high resolution and low power consumption. Applications include night driving, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. In our design, light from the DMD is imaged to infinity and the user’s own eye lens forms a real image on the user’s retina.
Intelligible machine learning with malibu.
Langlois, Robert E; Lu, Hui
2008-01-01
malibu is an open-source machine learning workbench developed in C/C++ for high-performance real-world applications, namely bioinformatics and medical informatics. It leverages third-party machine learning implementations for more robust, bug-free software. This workbench handles several well-studied supervised machine learning problems including classification, regression, importance-weighted classification and multiple-instance learning. The malibu interface was designed to create reproducible experiments ideally run in a remote and/or command line environment. The software can be found at: http://proteomics.bioengr.uic.edu/malibu/index.html.
Psychological Effects of U.S. Air Operations in Four Wars, 1941 - 1991. Lessons for the Commanders.
1995-01-01
enemy leaders are likely to attach high value to their retention of power and personal survival. To create negotiating leverage from these...and drive civilian workers away from their war-production jobs. Attacks on strategic targets have also been designed to induce other external...finished the job." See Williams, p. A21; William Drozdiak, "Armed Dissent in Baghdad, Saddam ReBaans His...," Washington Post, May 2, 1991, p. A29; Alan Cowell
Psychological Effects of U.S. Air Operations in Four Wars, 1941-1991. Lessons for U.S. Commanders,
1996-01-01
attach high value to their retention of power and personal survival. To create negotiating leverage from these fundamental enemy interests, a...Objectives 7 to demoralize and drive civilian workers away from their war-production jobs. Attacks on strategic targets have also been designed to...Iraqi put it, the U.S. forces "should have come to Baghdad and finished the job." See Williams (1991), p. A21; William Drozdiak, "Armed Dissent in Baghdad
Designing to Support Critical Engagement with Statistics
ERIC Educational Resources Information Center
Gresalfi, Melissa Sommerfeld
2015-01-01
The purpose of this paper is to describe a trajectory of designing for particular forms of engagement with mathematics. The forms of engagement that were targeted through these design experiments involved making intentional choices about which procedures to leverage in order to support particular claims (what I call "critical…
A New Measure of Centrality for Brain Networks
Joyce, Karen E.; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru
2010-01-01
Recent developments in network theory have allowed for the study of the structure and function of the human brain in terms of a network of interconnected components. Among the many nodes that form a network, some play a crucial role and are said to be central within the network structure. Central nodes may be identified via centrality metrics, with degree, betweenness, and eigenvector centrality being three of the most popular measures. Degree identifies the most connected nodes, whereas betweenness centrality identifies those located on the most traveled paths. Eigenvector centrality considers nodes connected to other high degree nodes as highly central. In the work presented here, we propose a new centrality metric called leverage centrality that considers the extent of connectivity of a node relative to the connectivity of its neighbors. The leverage centrality of a node in a network is determined by the extent to which its immediate neighbors rely on that node for information. Although similar in concept, there are essential differences between eigenvector and leverage centrality that are discussed in this manuscript. Degree, betweenness, eigenvector, and leverage centrality were compared using functional brain networks generated from healthy volunteers. Functional cartography was also used to identify neighborhood hubs (nodes with high degree within a network neighborhood). Provincial hubs provide structure within the local community, and connector hubs mediate connections between multiple communities. Leverage proved to yield information that was not captured by degree, betweenness, or eigenvector centrality and was more accurate at identifying neighborhood hubs. We propose that this metric may be able to identify critical nodes that are highly influential within the network. PMID:20808943
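Leverage centrality as described above compares a node's degree with the degrees of its immediate neighbors. A minimal sketch of that computation is shown below; the formula used here is the commonly reported formulation of the metric, restated as an assumption rather than taken verbatim from the paper.

```python
import networkx as nx

def leverage_centrality(G):
    """Leverage centrality: for node i with degree k_i and neighbors N_i,
    l_i = (1/k_i) * sum_{j in N_i} (k_i - k_j) / (k_i + k_j).
    Positive values indicate a node its neighbors rely on for connectivity."""
    lev = {}
    for i in G.nodes():
        k_i = G.degree(i)
        if k_i == 0:
            lev[i] = float("nan")   # undefined for isolated nodes
            continue
        lev[i] = sum((k_i - G.degree(j)) / (k_i + G.degree(j))
                     for j in G.neighbors(i)) / k_i
    return lev

# Toy network: a ring of six nodes, with node 0 turned into a hub
G = nx.cycle_graph(6)
G.add_edges_from((0, n) for n in range(6, 10))
print(leverage_centrality(G)[0])   # the hub has clearly positive leverage (~0.64)
```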
The role of anthropometry in designing for sustainability.
Nadadur, Gopal; Parkinson, Matthew B
2013-01-01
An understanding of human factors and ergonomics facilitates the design of artefacts, tasks and environments that fulfil their users' physical and cognitive requirements. Research in these fields furthers the goal of efficiently accommodating the desired percentage of user populations through enhanced awareness and modelling of human variability. Design for sustainability (DfS) allows for these concepts to be leveraged in the broader context of designing to minimise negative impacts on the environment. This paper focuses on anthropometry and proposes three ways in which its consideration is relevant to DfS: reducing raw material consumption, increasing usage lifetimes and ethical human resource considerations. This is demonstrated through the application of anthropometry synthesis, virtual fitting, and sizing and adjustability allocation methods in the design of an industrial workstation seat for use in five distinct global populations. This work highlights the importance of and opportunities for using ergonomic design principles in DfS efforts. This research demonstrates the relevance of some anthropometry-based ergonomics concepts to the field of design for sustainability. A global design case study leverages human variability considerations in furthering three sustainable design goals: reducing raw material consumption, increasing usage lifetimes and incorporating ethical human resource considerations in design.
ATLAST and JWST Segmented Telescope Design Considerations
NASA Technical Reports Server (NTRS)
Feinberg, Lee
2016-01-01
To the extent it makes sense, leverage JWST (James Webb Space Telescope) knowledge, designs, and architectures; GSE (Ground Support Equipment) is a good starting point. Develop a full end-to-end architecture that closes. Avoid reinventing the wheel except where needed, and optimize from there (mainly for stability and coronagraphy). Develop a scalable design reference mission (9.2 meters). Do just enough work to understand launch break points in aperture size. Demonstrate that 10 pm (picometer-level) stability is achievable on a design reference mission; a key design driver is the most robust stability possible. Make the design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so the instruments can evolve. Keep it at room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.
Application of Open Source Technologies for Oceanographic Data Analysis
NASA Astrophysics Data System (ADS)
Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.
2015-12-01
NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
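To make the tiling idea concrete, the hedged sketch below builds an area-averaged time series with a PySpark map/reduce over small geo/time tiles, mirroring the chunked storage and Spark-based analysis described above. The tile layout, field order, and query region are hypothetical and not the actual NEXUS data model; running it requires a local Spark installation.

```python
import numpy as np
from pyspark import SparkContext

sc = SparkContext("local[*]", "nexus-sketch")

# Hypothetical tiles: (time index, lat_min, lon_min, 2-D data block)
tiles = [(t, 10.0 * i, 20.0, np.full((4, 4), float(t + i)))
         for t in range(3) for i in range(3)]

def in_region(tile, lat_range=(0.0, 15.0)):
    """Spatial subsetting: keep only tiles whose latitude falls in the query box."""
    return lat_range[0] <= tile[1] <= lat_range[1]

series = (sc.parallelize(tiles)
            .filter(in_region)
            .map(lambda tile: (tile[0], (float(tile[3].sum()), tile[3].size)))
            .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))
            .mapValues(lambda s: s[0] / s[1])      # area mean per time step
            .collect())
print(sorted(series))
sc.stop()
```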
NASA Astrophysics Data System (ADS)
Tekin, Tolga; Töpper, Michael; Reichl, Herbert
2009-05-01
Technological frontiers between semiconductor technology, packaging, and system design are disappearing. Scaling down geometries [1] alone no longer delivers improved performance, lower power, smaller size, and lower cost. Achieving these goals will require "More than Moore" [2] through tighter integration of system-level components at the package level. System-in-Package (SiP) will deliver the efficient use of three dimensions (3D) through innovation in packaging and interconnect technology. A key bottleneck to the implementation of high-performance microelectronic systems, including SiP, is the lack of low-latency, high-bandwidth, and high-density off-chip interconnects. Some of the challenges in achieving high-bandwidth chip-to-chip communication using electrical interconnects include the high losses in the substrate dielectric, reflections and impedance discontinuities, and susceptibility to crosstalk [3]. Using photonics to overcome these challenges and provide low-latency, high-bandwidth communication will enable the vision of optical computing within next-generation architectures. Today's supercomputers offer sustained performance beyond a petaflop, which can be increased further by utilizing optical interconnects. Next-generation computing architectures will require ultra-low power consumption and ultra-high performance, enabled by novel interconnection technologies. In this paper we discuss a CMOS-compatible underlying technology to enable next-generation optical computing architectures. By introducing a new optical layer within the 3D SiP, the development of converged microsystems and their deployment in next-generation optical computing architectures can be leveraged.
Hybrid architecture for building secure sensor networks
NASA Astrophysics Data System (ADS)
Owens, Ken R., Jr.; Watkins, Steve E.
2012-04-01
Sensor networks have various communication and security architectural concerns. Three approaches are defined to address these concerns for sensor networks. The first area is the utilization of new computing architectures that leverage embedded virtualization software on the sensor. Deploying a small, embedded virtualization operating system on the sensor nodes that is designed to communicate to low-cost cloud computing infrastructure in the network is the foundation to delivering low-cost, secure sensor networks. The second area focuses on securing the sensor. Sensor security components include developing an identification scheme, and leveraging authentication algorithms and protocols that address security assurance within the physical, communication network, and application layers. This function will primarily be accomplished through encrypting the communication channel and integrating sensor network firewall and intrusion detection/prevention components to the sensor network architecture. Hence, sensor networks will be able to maintain high levels of security. The third area addresses the real-time and high priority nature of the data that sensor networks collect. This function requires that a quality-of-service (QoS) definition and algorithm be developed for delivering the right data at the right time. A hybrid architecture is proposed that combines software and hardware features to handle network traffic with diverse QoS requirements.
Understanding Usability: Investigating an Integrated Design Environment and Management System
ERIC Educational Resources Information Center
Lee, Jason Chong; Wahid, Shahtab; McCrickard, D. Scott; Chewar, C. M.; Congleton, Ben
2007-01-01
Purpose: Decades of innovation in designing usable (and unusable) interfaces have resulted in a plethora of guidelines, usability methods, and other design tools. The purpose of this research is to develop ways for novice developers to effectively leverage and contribute to the large and growing body of usability knowledge and methods.…
X-37 Storable Propulsion System Design and Operations
NASA Technical Reports Server (NTRS)
Rodriguez, Henry; Popp, Chris; Rehagen, Ronald J.
2005-01-01
In a response to NASA's X-37 TA-10 Cycle-1 contract, Boeing assessed nitrogen tetroxide (N2O4) and monomethyl hydrazine (MMH) Storable Propellant Propulsion Systems to select a low risk X-37 propulsion development approach. Space Shuttle lessons learned, planetary spacecraft, and Boeing Satellite HS-601 systems were reviewed to arrive at a low risk and reliable storable propulsion system. This paper describes the requirements, trade studies, design solutions, flight and ground operational issues which drove X-37 toward the selection of a storable propulsion system. The design of storable propulsion systems offers the leveraging of hardware experience that can accelerate progress toward critical design. It also involves the experience gained from launching systems using MMH and N2O4 propellants. Leveraging of previously flight-qualified hardware may offer economic benefits and may reduce risk in cost and schedule. This paper summarizes recommendations based on experience gained from Space Shuttle and similar propulsion systems utilizing MMH and N2O4 propellants. System design insights gained from flying storable propulsion are presented and addressed in the context of the design approach of the X-37 propulsion system.
X-37 Storable Propulsion System Design and Operations
NASA Technical Reports Server (NTRS)
Rodriguez, Henry; Popp, Chris; Rehegan, Ronald J.
2006-01-01
In a response to NASA's X-37 TA-10 Cycle-1 contract, Boeing assessed nitrogen tetroxide (N2O4) and monomethyl hydrazine (MMH) Storable Propellant Propulsion Systems to select a low risk X-37 propulsion development approach. Space Shuttle lessons learned, planetary spacecraft, and Boeing Satellite HS-601 systems were reviewed to arrive at a low risk and reliable storable propulsion system. This paper describes the requirements, trade studies, design solutions, flight and ground operational issues which drove X-37 toward the selection of a storable propulsion system. The design of storable propulsion systems offers the leveraging of hardware experience that can accelerate progress toward critical design. It also involves the experience gained from launching systems using MMH and N2O4 propellants. Leveraging of previously flight-qualified hardware may offer economic benefits and may reduce risk in cost and schedule. This paper summarizes recommendations based on experience gained from Space Shuttle and similar propulsion systems utilizing MMH and N2O4 propellants. System design insights gained from flying storable propulsion are presented and addressed in the context of the design approach of the X-37 propulsion system.
Shifted Transversal Design smart-pooling for high coverage interactome mapping
Xin, Xiaofeng; Rual, Jean-François; Hirozane-Kishikawa, Tomoko; Hill, David E.; Vidal, Marc; Boone, Charles; Thierry-Mieg, Nicolas
2009-01-01
“Smart-pooling,” in which test reagents are multiplexed in a highly redundant manner, is a promising strategy for achieving high efficiency, sensitivity, and specificity in systems-level projects. However, previous applications relied on low redundancy designs that do not leverage the full potential of smart-pooling, and more powerful theoretical constructions, such as the Shifted Transversal Design (STD), lack experimental validation. Here we evaluate STD smart-pooling in yeast two-hybrid (Y2H) interactome mapping. We employed two STD designs and two established methods to perform ORFeome-wide Y2H screens with 12 baits. We found that STD pooling achieves similar levels of sensitivity and specificity as one-on-one array-based Y2H, while the costs and workloads are divided by three. The screening-sequencing approach is the most cost- and labor-efficient, yet STD identifies about twofold more interactions. Screening-sequencing remains an appropriate method for quickly producing low-coverage interactomes, while STD pooling appears as the method of choice for obtaining maps with higher coverage. PMID:19447967
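The exact Shifted Transversal Design construction is not reproduced here, but the decoding idea the abstract relies on, in which each clone is placed in several pools and called positive only if all of its pools score positive, can be illustrated with a deliberately simplified toy layout. The pool assignments below are arbitrary and hypothetical, not an STD.

```python
def decode_positives(pool_assignment, positive_pools):
    """Toy smart-pooling decode: an item is called positive only if every pool
    containing it tested positive. Illustrates the redundancy idea behind
    STD-style designs; this is NOT the Shifted Transversal Design construction."""
    return {item for item, pools in pool_assignment.items()
            if set(pools) <= positive_pools}

# Hypothetical layout: 9 items, each placed in 2 of 6 pools (two "layers" of 3 pools)
items = range(9)
pool_assignment = {i: (i % 3, 3 + i // 3) for i in items}

true_positives = {4}
positive_pools = {p for i in true_positives for p in pool_assignment[i]}
print(decode_positives(pool_assignment, positive_pools))   # {4}
```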
Macroeconomic Dynamics of Assets, Leverage and Trust
NASA Astrophysics Data System (ADS)
Rozendaal, Jeroen C.; Malevergne, Yannick; Sornette, Didier
A macroeconomic model based on the economic variables (i) assets, (ii) leverage (defined as debt over asset) and (iii) trust (defined as the maximum sustainable leverage) is proposed to investigate the role of credit in the dynamics of economic growth, and how credit may be associated with both economic performance and confidence. Our first notable finding is the mechanism of reward/penalty associated with patience, as quantified by the return on assets. In regular economies where the EBITA/Assets ratio is larger than the cost of debt, starting with a trust higher than leverage results in the highest long-term return on assets (which can be seen as a proxy for economic growth). Therefore, patient economies that first build trust and then increase leverage are positively rewarded. Our second main finding concerns a recommendation for the reaction of a central bank to an external shock that affects negatively the economic growth. We find that late policy intervention in the model economy results in the highest long-term return on assets. However, this comes at the cost of suffering longer from the crisis until the intervention occurs. The phenomenon that late intervention is most effective to attain a high long-term return on assets can be ascribed to the fact that postponing intervention allows trust to increase first, and it is most effective to intervene when trust is high. These results are derived from two fundamental assumptions underlying our model: (a) trust tends to increase when it is above leverage; (b) economic agents learn optimally to adjust debt for a given level of trust and amount of assets. Using a Markov Switching Model for the EBITA/Assets ratio, we have successfully calibrated our model to the empirical data of the return on equity of the EURO STOXX 50 for the time period 2000-2013. We find that dynamics of leverage and trust can be highly nonmonotonous with curved trajectories, as a result of the nonlinear coupling between the variables. This has an important implication for policy makers, suggesting that simple linear forecasting can be deceiving in some regimes and may lead to inappropriate policy decisions.
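A toy discrete-time rendering of the model's two stated assumptions, (a) trust drifts upward when it exceeds leverage and (b) debt is adjusted toward the level that trust allows, is sketched below together with a simple retained-earnings rule. All functional forms, rates, and the symmetric downward drift of trust are assumptions added for illustration, not the calibrated model of the paper.

```python
def simulate(assets=100.0, debt=20.0, trust=0.5, ebita_ratio=0.08,
             cost_of_debt=0.05, trust_drift=0.02, debt_speed=0.5, steps=200):
    """Toy sketch of the assets/leverage/trust dynamics; all rates are assumptions."""
    for _ in range(steps):
        leverage = debt / assets
        earnings = ebita_ratio * assets - cost_of_debt * debt         # EBITA minus interest
        assets = max(assets + earnings, 1e-9)                         # retained earnings
        trust += trust_drift if trust > leverage else -trust_drift    # assumption (a), symmetrized
        trust = min(max(trust, 0.0), 1.0)
        debt = max(debt + debt_speed * (trust * assets - debt), 0.0)  # assumption (b)
    return assets, debt / assets, trust

# "Patient" economy (trust starts above leverage) versus an over-levered start
print(simulate(debt=20.0, trust=0.5))   # initial leverage 0.2 < trust 0.5
print(simulate(debt=80.0, trust=0.5))   # initial leverage 0.8 > trust 0.5
```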
Architecture for distributed design and fabrication
NASA Astrophysics Data System (ADS)
McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.
1997-01-01
We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.
Introducing WISDEM:An Integrated System Modeling for Wind Turbines and Plant (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykes, K.; Graf, P.; Scott, G.
2015-01-01
The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.
ERIC Educational Resources Information Center
Kim, Paul; Suh, Esther; Song, Donggil
2015-01-01
This exploratory study provides a deeper look into the aspects of students' experience from design-based learning (DBL) activities for fifth grade students. Using design-based research (DBR), this study was conducted on a series of science learning activities leveraging mobile phones with relevant applications and sensors. We observed 3 different…
ERIC Educational Resources Information Center
Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla
2014-01-01
Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…
Analysis of Levene's Test under Design Imbalance.
ERIC Educational Resources Information Center
Keyes, Tim K.; Levy, Martin S.
1997-01-01
H. Levene (1960) proposed a heuristic test for heteroscedasticity in the case of a balanced two-way layout, based on analysis of variance of absolute residuals. Conditions under which design imbalance affects the test's characteristics are identified, and a simple correction involving leverage is proposed. (SLD)
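Levene's heuristic is a one-way ANOVA applied to absolute residuals from the group means, which is easy to state in code. The sketch below shows the classical statistic on a deliberately unbalanced layout and cross-checks it against scipy.stats.levene; the leverage-based correction proposed in the paper is not reproduced here.

```python
import numpy as np
from scipy import stats

def levene_statistic(*groups):
    """Classical Levene test: one-way ANOVA on absolute residuals |x - group mean|.
    (The leverage-based correction for unbalanced layouts proposed in the paper
    is not reproduced here.)"""
    abs_resid = [np.abs(np.asarray(g) - np.mean(g)) for g in groups]
    return stats.f_oneway(*abs_resid)

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=30)   # deliberately unbalanced group sizes
b = rng.normal(0.0, 1.0, size=8)
c = rng.normal(0.0, 3.0, size=25)   # one group with inflated variance

print(levene_statistic(a, b, c))               # manual ANOVA-on-absolute-residuals
print(stats.levene(a, b, c, center="mean"))    # library version, for comparison
```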
Method and apparatus for determining and utilizing a time-expanded decision network
NASA Technical Reports Server (NTRS)
de Weck, Olivier (Inventor); Silver, Matthew (Inventor)
2012-01-01
A method, apparatus and computer program for determining and utilizing a time-expanded decision network is presented. A set of potential system configurations is defined. Next, switching costs are quantified to create a "static network" that captures the difficulty of switching among these configurations. A time-expanded decision network is provided by expanding the static network in time, including chance and decision nodes. Minimum cost paths through the network are evaluated under plausible operating scenarios. The set of initial design configurations are iteratively modified to exploit high-leverage switches and the process is repeated to convergence. Time-expanded decision networks are applicable, but not limited to, the design of systems, products, services and contracts.
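The patent abstract describes expanding a static network of configurations and switching costs over time and then evaluating minimum-cost paths. A minimal sketch of that evaluation step is given below; it is deterministic only (chance nodes and scenarios are omitted), and the configurations and costs are hypothetical.

```python
import networkx as nx

configs = ["A", "B"]
periods = 4
operating_cost = {"A": 5.0, "B": 2.0}
switch_cost = {("A", "B"): 10.0, ("B", "A"): 1.0}   # the "static network" of switching costs

# Time expansion: replicate each configuration once per period and connect
# consecutive periods with stay/switch arcs carrying operating plus switching cost.
G = nx.DiGraph()
for t in range(periods - 1):
    for c1 in configs:
        for c2 in configs:
            cost = operating_cost[c2] + (0.0 if c1 == c2 else switch_cost[(c1, c2)])
            G.add_edge((t, c1), (t + 1, c2), weight=cost)

path = nx.shortest_path(G, source=(0, "A"), target=(periods - 1, "B"), weight="weight")
total = sum(G[u][v]["weight"] for u, v in zip(path, path[1:]))
print(path, total)   # cheapest switching plan and its total cost
```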
Leveraging design thinking to build sustainable mobile health systems.
Eckman, Molly; Gorski, Irena; Mehta, Khanjan
Mobile health, or mHealth, technology has the potential to improve health care access in the developing world. However, the majority of mHealth projects do not expand beyond the pilot stage. A core reason is that they do not account for the individual needs and wants of those involved. A collaborative approach is needed to integrate the perspectives of all stakeholders into the design and operation of mHealth endeavours. Design thinking is a methodology used to develop and evaluate novel concepts for systems. With roots in participatory processes and self-determined pathways, design thinking provides a compelling framework to understand and apply the needs of diverse stakeholders to mHealth project development through a highly iterative process. The methodology presented in this article provides a structured approach to applying design thinking principles to assess the feasibility of novel mHealth endeavours during early conceptualisation.
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
Focusing metasurface quantum-cascade laser with a near diffraction-limited beam
Xu, Luyao; Chen, Daguan; Itoh, Tatsuo; ...
2016-10-17
A terahertz vertical-external-cavity surface-emitting-laser (VECSEL) is demonstrated using an active focusing reflectarray metasurface based on quantum-cascade gain material. The focusing effect enables a hemispherical cavity with flat optics, which exhibits higher geometric stability than a plano-plano cavity and produces a directive, circular, near-diffraction-limited Gaussian beam with an M² beam parameter as low as 1.3 and a brightness of 1.86 × 10⁶ W sr⁻¹ m⁻². As a result, this work demonstrates the potential of leveraging inhomogeneous metasurface and reflectarray designs to achieve high-power and high-brightness terahertz quantum-cascade VECSELs.
Innovative divertor concept development on DIII-D and EAST
Guo, H. Y.; Allen, S.; Canik, J.; ...
2016-06-02
A critical issue facing the design and operation of next-step high-power steady-state fusion devices is the control of heat fluxes and erosion at the plasma-facing components, in particular, the divertor target plates. A new initiative has been launched on DIII-D to develop and demonstrate innovative boundary plasma-materials interface solutions. The central purposes of this new initiative are to advance scientific understanding in this critical area and develop an advanced divertor concept for application to next-step fusion devices. Finally, DIII-D will leverage strong collaborative efforts on the EAST superconducting tokamak for extending integrated high performance advanced divertor solutions to true steady-state.
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
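The final machine-learning step could be prototyped roughly as below; the descriptors, the synthetic "DFT" energies, and the random-forest choice are placeholders for illustration, not the feature set or model used in the paper.

```python
# Minimal sketch (synthetic data, hypothetical descriptors): surrogate model
# predicting segregation energies from simple elemental descriptors, in the
# spirit of the machine-learning step described above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_elements = 34
# Placeholder descriptors: e.g. atomic radius, electronegativity, valence
# (not the paper's actual feature set).
X = rng.normal(size=(n_elements, 3))
# Placeholder "DFT" segregation energies (eV); a real workflow would read
# these from the supercell calculations.
y = 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * rng.normal(size=n_elements)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("CV MAE (eV):", -scores.mean())
model.fit(X, y)
```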
Petascale supercomputing to accelerate the design of high-temperature alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ′-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Fiber-Based Lasers as an Option for GRACE Follow-On Light Source
NASA Technical Reports Server (NTRS)
Camp, Jordan
2010-01-01
Fiber-based lasers offer a number of attractive characteristics for space application: state-of-the-art laser technology, leverage of design and reliability from the substantial investments of the telecom industry, and convenient redundancy of higher-risk components through fiber splicing. At NASA/Goddard we are currently investigating three GFO fiber-based laser options: a fiber oscillator built in our laboratory; an effort to space qualify a commercial design that uses a proprietary high-gain fiber cavity; and the space qualification of a promising new commercial external cavity laser, notable for its low-mass, compact design. In my talk I will outline these efforts, and suggest that the GFO Project may soon have the option of a US laser vendor for its light source.
ERIC Educational Resources Information Center
Ryan, Kenneth; Kopischke, Kevin
2008-01-01
The Remote Automation Management Platform (RAMP) is a real-time, interactive teaching tool which leverages common off-the-shelf internet technologies to provide high school learners extraordinary access to advanced technical education opportunities. This outreach paradigm is applicable to a broad range of advanced technical skills from automation…
The Fidelity and Usability of 5-DIE: A Design Study of Enacted Cyberlearning
ERIC Educational Resources Information Center
Kern, Cindy L.; Crippen, Kent J.; Skaza, Heather
2014-01-01
This paper describes a design study of a cyberlearning instructional unit about climate change created with a new inquiry-based design framework, the 5-featured Dynamic Inquiry Enterprise (5-DIE). The 5-DIE framework was created to address the need for authentic science inquiry experiences in cyberlearning environments that leverage existing tools…
ERIC Educational Resources Information Center
Chee, Yam San
2014-01-01
Design research has been positioned as an important methodological contribution of the learning sciences. Despite the publication of a handbook on the subject, the practice of design research in education remains an eclectic collection of specific approaches implemented by different researchers and research groups. In this paper, I examine the…
An Overview of Selected Theories about Student Learning
ERIC Educational Resources Information Center
Goel, Sanjay
2011-01-01
Engineering educators are often not familiar with the theories and research findings of educational psychology, adult development, curriculum design, and instruction design. Even the published research in engineering/computing education does not sufficiently leverage this body of knowledge. Often in the educational reports and recommendations by…
Leveraging Guided Pathways to Improve Financial Aid Design and Delivery
ERIC Educational Resources Information Center
Luna-Torres, Maria; Leafgreen, Melet; McKinney, Lyle
2017-01-01
To address low completion rates, postsecondary leaders are championing a "guided pathways" approach that puts students on a prescribed route towards graduation. Designing solutions to address low completion rates is complex; in addition to academic roadblocks, insufficient financial resources coupled with a complicated financial aid…
Leveraging Information Technology. Track II: Innovative Management.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Seven papers from the 1987 CAUSE conference's Track II, Innovative Management, are presented. They include: "Is This Creative, or What!" (Kenneth C. Blythe); "Joint Application Design: Can a User Committee Design a System in Four Days?" (Diane Kent, David Smithers); "Making It Happen without Appropriation" (Robert E.…
Design and Control of a Pneumatically Actuated Transtibial Prosthesis.
Zheng, Hao; Shen, Xiangrong
2015-04-01
This paper presents the design and control of a pneumatically actuated transtibial prosthesis, which utilizes a pneumatic cylinder-type actuator to power the prosthetic ankle joint to support the user's locomotion. The pneumatic actuator has multiple advantages over the traditional electric motor, such as light weight, low cost, and high power-to-weight ratio. The objective of this work is to develop a compact and lightweight transtibial prosthesis, leveraging the multiple advantages provided by this highly competitive actuator. In this paper, the design details of the prosthesis are described, including the determination of performance specifications, the layout of the actuation mechanism, and the calculation of the torque capacity. Through the authors' design calculation, the prosthesis is able to provide sufficient range of motion and torque capacity to support the locomotion of a 75 kg individual. The controller design is also described, including the underlying biomechanical analysis and the formulation of the finite-state impedance controller. Finally, the human subject testing results are presented, with the data indicating that the prosthesis is able to generate a natural walking gait and sufficient power output for its amputee user.
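The finite-state impedance controller described above can be sketched as follows; the phase names, gains, and equilibrium angles are illustrative stand-ins rather than the authors' tuned values.

```python
# Minimal sketch (illustrative gains, not the authors' values): a finite-state
# impedance controller where each gait phase maps joint state to ankle torque
# through stiffness, damping, and an equilibrium angle.
from dataclasses import dataclass

@dataclass
class Impedance:
    k: float         # stiffness, N·m/rad
    b: float         # damping, N·m·s/rad
    theta_eq: float  # equilibrium ankle angle, rad

# Hypothetical gait phases for level walking.
PHASES = {
    "early_stance": Impedance(k=3.0, b=0.05, theta_eq=0.00),
    "late_stance":  Impedance(k=5.5, b=0.02, theta_eq=-0.15),
    "swing":        Impedance(k=1.0, b=0.03, theta_eq=0.05),
}

def ankle_torque(phase: str, theta: float, omega: float) -> float:
    """Impedance law: tau = -k * (theta - theta_eq) - b * omega."""
    z = PHASES[phase]
    return -z.k * (theta - z.theta_eq) - z.b * omega

# Example: commanded torque in late stance at 10 deg dorsiflexion.
print(ankle_torque("late_stance", theta=0.17, omega=0.8))
```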
Design and Control of a Pneumatically Actuated Transtibial Prosthesis
Zheng, Hao; Shen, Xiangrong
2015-01-01
This paper presents the design and control of a pneumatically actuated transtibial prosthesis, which utilizes a pneumatic cylinder-type actuator to power the prosthetic ankle joint to support the user's locomotion. The pneumatic actuator has multiple advantages over the traditional electric motor, such as light weight, low cost, and high power-to-weight ratio. The objective of this work is to develop a compact and lightweight transtibial prosthesis, leveraging the multiple advantages provided by this highly competitive actuator. In this paper, the design details of the prosthesis are described, including the determination of performance specifications, the layout of the actuation mechanism, and the calculation of the torque capacity. Through the authors’ design calculation, the prosthesis is able to provide sufficient range of motion and torque capacity to support the locomotion of a 75 kg individual. The controller design is also described, including the underlying biomechanical analysis and the formulation of the finite-state impedance controller. Finally, the human subject testing results are presented, with the data indicating that the prosthesis is able to generate a natural walking gait and sufficient power output for its amputee user. PMID:26146497
Predictive Design of Interfacial Functionality in Polymer Matrix Composites
2017-05-24
structural design criteria. Due to the poor accessibility of interfaces by experimental means, little is known about the molecular definition, defect… is designed to allow for concurrent light scattering measurements, which establishes a unique experimental resource. We were able to leverage this… (AFRL-AFOSR-VA-TR-2017-0103, John Kieffer, University of Michigan)
Not Dean School: Leadership Development for Faculty Where They Are
ERIC Educational Resources Information Center
Wilks, Karrin E.; Shults, Christopher; Berg, James J.
2018-01-01
Leadership development for faculty often is designed as training for administration, but faculty demonstrate leadership in the classroom, in their departments, college-wide, and beyond. To fully realize and leverage this leadership potential, colleges must design opportunities for faculty to hone their knowledge and skills as active participants…
ERIC Educational Resources Information Center
Karchmer-Klein, Rachel; Mouza, Chrystalla; Harlow Shinas, Valerie; Park, Sohee
2017-01-01
The purpose of this study was to examine patterns evident in the ways middle school teachers, who value technology integration, design instruction that leverages educational applications (app) affordances. Using the pedagogy of multiliteracies (Cope & Kalantzis, 2015) and app affordances of multimodality, collaboration, and interactivity as…
Industrial Sponsor Perspective on Leveraging Capstone Design Projects to Enhance Their Business
ERIC Educational Resources Information Center
Weissbach, Robert S.; Snyder, Joseph W.; Evans, Edward R., Jr.; Carucci, James R., Jr.
2017-01-01
Capstone design projects have become commonplace among engineering and engineering technology programs. These projects are valuable tools when assessing students, as they require students to work in teams, communicate effectively, and demonstrate technical competency. The use of industrial sponsors enhances these projects by giving these projects…
Multilevel Design of School Effectiveness Studies in Sub-Saharan Africa
ERIC Educational Resources Information Center
Kelcey, Ben; Shen, Zuchao
2016-01-01
School-based improvement programs represent a core strategy in improving education because they can leverage pre-existing social and organizational structures to promote coordinated and comprehensive change across multiple facets of schooling. School-based programs are generally designed to be implemented by intact schools/districts, frequently…
Big Data | Transportation Research | NREL
Designs Leveraging Fleet DNA data to characterize real-world duty cycles for urban delivery vehicles, NREL -extended electric vehicles for urban delivery applications, targeting efficiency improvements of 50
ERIC Educational Resources Information Center
DiGiacomo, Daniela Kruel; Gutiérrez, Kris D.
2017-01-01
Drawing upon four years of research within a social design experiment, we focus on how teacher learning can be supported in designed environments that are organized around robust views of learning, culture, and equity. We illustrate both the possibility and difficulty of helping teachers disrupt the default teaching scripts that privilege…
NASA Astrophysics Data System (ADS)
Kastens, K. A.; Krumhansl, R.
2016-12-01
The Next Generation Science Standards incorporate a stronger emphasis on having students work with data than did prior standards. This emphasis is most obvious in Practice 4: Analyzing and Interpreting Data, but also permeates performance expectations built on Practice 2 when students test models, Practice 6 when students construct explanations, and Practice 7 when students test claims with evidence. To support curriculum developers who wish to guide high school students towards more sophisticated engagement with complex data, we analyzed a well-regarded body of instructional materials designed for use in introductory college courses (http://serc.carleton.edu/integrate/teaching_materials/). Our analysis sought design patterns that can be reused for a variety of topics at the high school or college level. We found five such patterns, each of which was used in at least half of the modules analyzed. We describe each pattern, provide an example, and hypothesize a theory of action that could explain how the sequence of activities leverages known perceptual, cognitive and/or social processes to foster learning from and about data. In order from most to least frequent, the observed design patterns are as follows: In Data Puzzles, students respond to guiding questions about high-value snippets of data pre-selected and sequenced by the curriculum developer to lead to an Aha! inference. In Pooling Data to See the Big Picture, small groups analyze different instances of an analogous phenomenon (e.g. different hurricanes, or different divergent plate boundaries) and pool their insights to extract the commonalities that constitute the essence of that phenomenon. In Make a Decision or Recommendation, students combine geoscience data with other factors (such as economic or environmental justice concerns) to make a decision or recommendation about a human or societal action. In Predict-Observe-Explain, students make a prediction about what the Earth will look like under conditions they have not yet seen and test their prediction with data. In Nested Data Sets, students first interpret local data leveraging field experience or life experience, and then expand their interpretation across larger spatial or temporal scales, drawing on lines of reasoning developed at the local scale.
Enabling Large-Scale Design, Synthesis and Validation of Small Molecule Protein-Protein Antagonists
Koes, David; Khoury, Kareem; Huang, Yijun; Wang, Wei; Bista, Michal; Popowicz, Grzegorz M.; Wolf, Siglinde; Holak, Tad A.; Dömling, Alexander; Camacho, Carlos J.
2012-01-01
Although there is no shortage of potential drug targets, there are only a handful of known low-molecular-weight inhibitors of protein-protein interactions (PPIs). One problem is that current efforts are dominated by low-yield high-throughput screening, whose rigid framework is not suitable for the diverse chemotypes present in PPIs. Here, we developed a novel pharmacophore-based interactive screening technology that builds on the role anchor residues, or deeply buried hot spots, have in PPIs, and redesigns these entry points with anchor-biased virtual multicomponent reactions, delivering tens of millions of readily synthesizable novel compounds. Application of this approach to the MDM2/p53 cancer target led to high hit rates, resulting in a large and diverse set of confirmed inhibitors, and co-crystal structures validate the designed compounds. Our unique open-access technology promises to expand chemical space and the exploration of the human interactome by leveraging in-house small-scale assays and user-friendly chemistry to rationally design ligands for PPIs with known structure. PMID:22427896
Leverage points for sustainability transformation.
Abson, David J; Fischer, Joern; Leventon, Julia; Newig, Jens; Schomerus, Thomas; Vilsmaier, Ulli; von Wehrden, Henrik; Abernethy, Paivi; Ives, Christopher D; Jager, Nicolas W; Lang, Daniel J
2017-02-01
Despite substantial focus on sustainability issues in both science and politics, humanity remains on largely unsustainable development trajectories. Partly, this is due to the failure of sustainability science to engage with the root causes of unsustainability. Drawing on ideas by Donella Meadows, we argue that many sustainability interventions target highly tangible, but essentially weak, leverage points (i.e. using interventions that are easy, but have limited potential for transformational change). Thus, there is an urgent need to focus on less obvious but potentially far more powerful areas of intervention. We propose a research agenda inspired by systems thinking that focuses on transformational 'sustainability interventions', centred on three realms of leverage: reconnecting people to nature, restructuring institutions and rethinking how knowledge is created and used in pursuit of sustainability. The notion of leverage points has the potential to act as a boundary object for genuinely transformational sustainability science.
NASA Astrophysics Data System (ADS)
Cicak, Katarina; Lecocq, Florent; Ranzani, Leonardo; Peterson, Gabriel A.; Kotler, Shlomi; Teufel, John D.; Simmonds, Raymond W.; Aumentado, Jose
Recent developments in coupled mode theory have opened the doors to new nonreciprocal amplification techniques that can be directly leveraged to produce high quantum efficiency in current measurements in microwave quantum information. However, taking advantage of these techniques requires flexible multi-mode circuit designs comprised of low-loss materials that can be implemented using common fabrication techniques. In this talk we discuss the design and fabrication of a new class of multi-pole lumped-element superconducting parametric amplifiers based on Nb/Al-AlOx/Nb Josephson junctions on silicon or sapphire. To reduce intrinsic loss in these circuits we utilize PECVD amorphous silicon as a low-loss dielectric (tan δ ≈ 5 × 10⁻⁴), resulting in nearly quantum-limited directional amplification.
Schmitz, Max; Dähler, Fabian; Elvinger, François; Pedretti, Andrea; Steinfeld, Aldo
2017-04-10
We introduce a design methodology for nonimaging, single-reflection mirrors with polygonal inlet apertures that generate a uniform irradiance distribution on a polygonal outlet aperture, enabling a multitude of applications within the domain of concentrated photovoltaics. Notably, we present single-mirror concentrators of square and hexagonal perimeter that achieve very high irradiance uniformity on a square receiver at concentrations ranging from 100 to 1000 suns. These optical designs can be assembled in compound concentrators with maximized active area fraction by leveraging tessellation. More advanced multi-mirror concentrators, where each mirror individually illuminates the whole area of the receiver, allow for improved performance while permitting greater flexibility for the concentrator shape and robustness against partial shading of the inlet aperture.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem are successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-07
..., inform training and employment program design and investment decision-making, support consultations with strategic partners, and leverage limited labor market information-workforce information (LMI-WI) program...
Arcus: Exploring the formation and evolution of clusters, galaxies, and stars
NASA Astrophysics Data System (ADS)
Smith, Randall K.
2017-08-01
Arcus, a proposed soft X-ray grating spectrometer Explorer, leverages recent advances in critical-angle transmission (CAT) gratings and silicon pore optics (SPOs), using CCDs with strong Suzaku heritage and electronics based on the Swift mission; both the spacecraft and mission operations reuse highly successful designs. To be launched in 2023, Arcus will be the only observatory capable of studying, in detail, the hot galactic and intergalactic gas that is the dominant baryonic component of the present-day Universe and ultimate reservoir of entropy, metals and the output from cosmic feedback. Its superior soft (12-50Å) X-ray sensitivity will complement forthcoming calorimeters, which will have comparably high spectral resolution above 2 keV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits, creating a graphical end-to-end umbrella that guides end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivers strategic advancements in meshing and visualization for ensembles.
Invoking the User from Data to Design
ERIC Educational Resources Information Center
Tempelman-Kluit, Nadaleen; Pearce, Alexa
2014-01-01
Personas, stemming from the field of user-centered design (UCD), are hypothetical users that represent the behaviors, goals, and values of actual users. This study describes the creation of personas in an academic library. With the goal of leveraging service-generated data, the authors coded a sample of chat reference transcripts, producing two…
Learning and Learning-to-Learn by Doing: Simulating Corporate Practice in Law School.
ERIC Educational Resources Information Center
Okamoto, Karl S.
1995-01-01
A law school course in advanced corporate legal practice is described. The course, a series of simulated lawyering tasks centered on a hypothetical leveraged buyout transaction, is designed to go beyond basic legal analysis to develop professional expertise in legal problem solving. The course description includes goals, syllabus design,…
The Kinematic Analysis of Flat Leverage Mechanism of the Third Class
NASA Astrophysics Data System (ADS)
Zhauyt, A.; Mamatova, G.; Abdugaliyeva, G.; Alipov, K.; Sakenova, A.; Alimbetov, A.
2017-10-01
When designing flat link mechanisms of high class, it is necessary to perform strength calculations for the links after the block diagrams and link linear sizes have been defined, i.e. it is rational to choose their forms and to determine the section sizes. This work offers an algorithm for determining the link lengths of mechanisms of high classes (MHC) and their metric parameters by successive approximation. In this paper, educational and research software named GIM is presented. This software has been developed with the aim of addressing the difficulties students usually encounter when facing up to kinematic analysis of mechanisms. A deep understanding of the kinematic analysis is necessary to go a step further into design and synthesis of mechanisms. In order to support and complement the theoretical lectures, the GIM software is used during the practical exercises, serving as a complementary educational tool that reinforces the knowledge acquired by the students.
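To make the kind of position analysis that tools like GIM automate concrete, the sketch below solves the loop-closure equations of a simple planar four-bar numerically; the third-class linkage of the paper would involve additional loops, but the principle is the same. Link lengths and the initial guess are arbitrary illustrative values.

```python
# Minimal sketch (a planar four-bar, not the third-class linkage in the paper):
# kinematic position analysis by numerically solving the loop-closure equations.
import numpy as np
from scipy.optimize import fsolve

L1, L2, L3, L4 = 0.30, 0.10, 0.25, 0.25   # ground, crank, coupler, rocker (m)

def loop_closure(angles, theta2):
    theta3, theta4 = angles
    # Vector loop: L2 + L3 = L1 + L4 (x and y components).
    fx = L2 * np.cos(theta2) + L3 * np.cos(theta3) - L1 - L4 * np.cos(theta4)
    fy = L2 * np.sin(theta2) + L3 * np.sin(theta3) - L4 * np.sin(theta4)
    return [fx, fy]

theta2 = np.deg2rad(60.0)                  # input crank angle
theta3, theta4 = fsolve(loop_closure, x0=[0.3, 1.8], args=(theta2,))
print(np.rad2deg([theta3, theta4]))
```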
Library Design-Facilitated High-Throughput Sequencing of Synthetic Peptide Libraries.
Vinogradov, Alexander A; Gates, Zachary P; Zhang, Chi; Quartararo, Anthony J; Halloran, Kathryn H; Pentelute, Bradley L
2017-11-13
A methodology to achieve high-throughput de novo sequencing of synthetic peptide mixtures is reported. The approach leverages shotgun nanoliquid chromatography coupled with tandem mass spectrometry-based de novo sequencing of library mixtures (up to 2000 peptides) as well as automated data analysis protocols to filter away incorrect assignments, noise, and synthetic side-products. For increasing the confidence in the sequencing results, mass spectrometry-friendly library designs were developed that enabled unambiguous decoding of up to 600 peptide sequences per hour while maintaining greater than 85% sequence identification rates in most cases. The reliability of the reported decoding strategy was additionally confirmed by matching fragmentation spectra for select authentic peptides identified from library sequencing samples. The methods reported here are directly applicable to screening techniques that yield mixtures of active compounds, including particle sorting of one-bead one-compound libraries and affinity enrichment of synthetic library mixtures performed in solution.
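One of the simplest filters such a pipeline could apply is a precursor-mass consistency check. The sketch below is an illustrative stand-in for, not a reproduction of, the authors' automated filtering protocol; it uses standard monoisotopic residue masses and a hypothetical ppm tolerance.

```python
# Minimal sketch: discard de novo sequence assignments whose theoretical
# monoisotopic mass disagrees with the observed precursor mass beyond a
# ppm tolerance.
MONOISOTOPIC = {  # residue masses, Da
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276, "V": 99.06841,
    "T": 101.04768, "L": 113.08406, "N": 114.04293, "D": 115.02694,
    "Q": 128.05858, "K": 128.09496, "E": 129.04259, "F": 147.06841,
    "R": 156.10111, "W": 186.07931,
}
WATER = 18.01056

def peptide_mass(seq: str) -> float:
    return sum(MONOISOTOPIC[aa] for aa in seq) + WATER

def passes_filter(seq: str, observed_mass: float, tol_ppm: float = 10.0) -> bool:
    theo = peptide_mass(seq)
    return abs(theo - observed_mass) / theo * 1e6 <= tol_ppm

print(passes_filter("ALKWSER", observed_mass=peptide_mass("ALKWSER") + 0.0005))
```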
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
NASA Technical Reports Server (NTRS)
Rincon, Rafael F.
2008-01-01
The reconfigurable L-Band radar is an ongoing development at NASA/GSFC that exploits the capability inherent in phased array radar systems with a state-of-the-art data acquisition and real-time processor in order to enable multi-mode measurement techniques in a single radar architecture. The development leverages the L-Band Imaging Scatterometer, a radar system designed for the development and testing of new radar techniques, and the custom-built DBSAR processor, a highly reconfigurable, high speed data acquisition and processing system. The radar modes currently implemented include scatterometer, synthetic aperture radar, and altimetry; plans to add new modes such as radiometry and bi-static GNSS signals are being formulated. This development is aimed at enhancing the radar remote sensing capabilities for airborne and spaceborne applications in support of Earth Science and planetary exploration. This paper describes the design of the radar and processor systems, explains the operational modes, and discusses preliminary measurements and future plans.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-01-01
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250
An Investigation of New Snow Water Equivalence Sensing Modalities
NASA Astrophysics Data System (ADS)
Frolik, J.; Skalka, C.; Wemple, B.
2008-12-01
It is well known that snowpack is highly variable and influenced by a range of factors, including topography and vegetation cover. As such, single point measurements may be viewed as being inadequate to characterize snowpack in a given area. Thus motivated by the desire for distributed sensing, this work presents results of a proof-of-concept investigation for new, low-cost, snow water equivalence (SWE) sensors based on the attenuation of microwave and gamma radiation. First, our work considers the attenuation of microwave signals at 2.4 GHz and 5 GHz due to an accumulating snowpack. These frequencies coincide with those used for common wireless networks, and thus our proposed sensor can leverage existing hardware designs which are low-cost and power efficient. Second, we present attenuation data for radiation energy occurring between 500 keV and 1 MeV. These results were obtained utilizing a radiation detector based on Cadmium Zinc Telluride (CZT) technology. The proposed sensor will leverage recent investments such as CZT-based designs for homeland security applications. We contend that sensors based on these modalities will be low-cost and low-energy and thus readily integrated with wireless sensor network hardware for distributed monitoring. In addition, these sensors will be compact and thus can be placed in locations not feasible for current SWE sensor designs (e.g., snow pillows) or in locations too dangerous for snow course measurements (e.g., areas prone to avalanche). Since neither sensing method requires contact with the snowpack, these modalities are also immune to snow bridging effects which plague existing designs. We also present preliminary findings of work conducted in a mountainous forested setting in northern New England which examines the influence of forest vegetation on snowpack.
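Assuming a simple exponential (Beer-Lambert-style) attenuation law, the measured loss can be inverted for SWE roughly as sketched below. The attenuation coefficient is a hypothetical placeholder that would need calibration per frequency or gamma energy; the real sensors' calibration may differ.

```python
# Minimal sketch (assumed exponential attenuation, illustrative coefficient):
# inverting a Beer-Lambert-style law to estimate snow water equivalence from
# received signal power, as a transmission-based SWE sensor might do.
import math

def swe_from_attenuation(p_received_dBm: float,
                         p_reference_dBm: float,
                         mass_atten_coeff: float = 0.007) -> float:
    """Return SWE in mm of water.

    p_reference_dBm : received power with no snow on the path
    mass_atten_coeff: effective attenuation coefficient (1/mm of water),
                      a hypothetical value requiring per-frequency calibration.
    """
    attenuation_dB = p_reference_dBm - p_received_dBm
    attenuation_np = attenuation_dB * math.log(10) / 10.0   # dB -> nepers
    return attenuation_np / mass_atten_coeff

print(swe_from_attenuation(p_received_dBm=-62.0, p_reference_dBm=-55.0))
```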
Future NTP Development Synergy Leveraged from Current J-2X Engine Development
NASA Astrophysics Data System (ADS)
Ballard, Richard O.
2008-01-01
This paper is a discussion of how the many long-lead development elements required for the realization of a future nuclear thermal propulsion (NTP) system can be effectively leveraged from the ongoing work being conducted on the J-2X engine program for the Constellation Program. Development studies conducted to date for NTP forward planning have identified a number of technical areas that will require advancement to acceptable technology readiness levels (TRLs) before they can be utilized in NTP system development. These include a high-temperature, high-area-ratio nozzle extension; long-life, low-NPSP turbomachinery; low-boiloff propellant management; and a qualified nuclear fuel element. The current J-2X program is working many of these areas that can be leveraged to support NTP development in a highly compatible and synergistic fashion. In addition to supporting technical development, there are other programmatic issues being worked in the J-2X program that can be leveraged by a future NTP development program. These include compliance with recently-evolved space system requirements such as human-rating, fault tolerance and fracture control. These and other similar mandatory system requirements have been adopted by NASA and can result in a significant technical impact beyond elevation of the root technologies required by NTP. Finally, the exploitation of experience, methodologies, and procedures developed by the J-2X program in the areas of verification, qualification, certification, altitude simulation testing, and facility definition will be especially applicable to a future NTP system. The similarities in system mission (in-space propulsion) and operational environment (vacuum, zero-gee) between J-2X and NTP make this highly synergistic. Thus, it can be shown that the collective benefit of leveraging experience and technologies developed during the J-2X program can result in significant savings in development cost and schedule for NTP.
Future NTP Development Synergy Leveraged from Current J-2X Engine Development
NASA Technical Reports Server (NTRS)
Ballard, Richard O.
2008-01-01
This paper is a discussion of how the many long-lead development elements required for the realization of a future nuclear thermal propulsion (NTP) system can be effectively leveraged from the ongoing work being conducted on the J-2X engine program for the Constellation Program. Development studies conducted to date for NTP forward planning have identified a number of technical areas that will require advancement to acceptable technology readiness levels (TRLs) before they can be utilized in NTP system development. These include a high-temperature, high-area-ratio nozzle extension; long-life, low-NPSP turbomachinery; low-boiloff propellant management; and a qualified nuclear fuel element. The current J-2X program is working many of these areas that can be leveraged to support NTP development in a highly compatible and synergistic fashion. In addition to supporting technical development, there are other programmatic issues being worked in the J-2X program that can be leveraged by a future NTP development program. These include compliance with recently-evolved space system requirements such as human-rating, fault tolerance and fracture control. These and other similar mandatory system requirements have been adopted by NASA and can result in a significant technical impact beyond elevation of the root technologies required by NTP. Finally, the exploitation of experience, methodologies, and procedures developed by the J-2X program in the areas of verification, qualification, certification, altitude simulation testing, and facility definition will be especially applicable to a future NTP system. The similarities in system mission (in-space propulsion) and operational environment (vacuum, zero-gee) between J-2X and NTP make this highly synergistic. Thus, it can be shown that the collective benefit of leveraging experience and technologies developed during the J-2X program can result in significant savings in development cost and schedule for NTP.
Practical 3D Printing of Antennas and RF Electronics
2017-03-01
Passive RF; Combiners Introduction Additive manufacturing can reduce the time and material costs in a design cycle and enable the on-demand printing of...performance, and create Computer Assisted Manufacturing (CAM) files. By intelligently leveraging this process, the design can be readily updated or...advances in 3D printing technology now enable antennas and RF electronics to be designed and prototyped significantly faster than conventional
A Parallel Trade Study Architecture for Design Optimization of Complex Systems
NASA Technical Reports Server (NTRS)
Kim, Hongman; Mullins, James; Ragon, Scott; Soremekun, Grant; Sobieszczanski-Sobieski, Jaroslaw
2005-01-01
Design of a successful product requires evaluating many design alternatives in a limited design cycle time. This can be achieved through leveraging design space exploration tools and available computing resources on the network. This paper presents a parallel trade study architecture to integrate trade study clients and computing resources on a network using Web services. The parallel trade study solution is demonstrated to accelerate design of experiments, genetic algorithm optimization, and a cost as an independent variable (CAIV) study for a space system application.
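A stripped-down sketch of the idea follows, with local worker processes standing in for the paper's Web-service computing resources; the design variables and the toy response function are invented for illustration.

```python
# Minimal sketch: farming out design-of-experiments cases in parallel and
# collecting the responses for a trade study.
from concurrent.futures import ProcessPoolExecutor
import itertools

def evaluate_design(case):
    """Placeholder disciplinary analysis; a real study would call the
    engineering codes here."""
    thrust, dry_mass = case
    payload = 0.12 * thrust - 1.8 * dry_mass   # toy response
    return {"thrust": thrust, "dry_mass": dry_mass, "payload": payload}

if __name__ == "__main__":
    doe = list(itertools.product([100, 150, 200], [5.0, 6.0, 7.0]))  # full factorial
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(evaluate_design, doe))
    best = max(results, key=lambda r: r["payload"])
    print(best)
```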
Zhang, Liu-Xia; Cao, Yi-Ren; Xiao, Hua; Liu, Xiao-Ping; Liu, Shao-Rong; Meng, Qing-Hua; Fan, Liu-Yin; Cao, Cheng-Xi
2016-03-15
In the present work, we present a simple, rapid, and quantitative analytical method for the detection of different proteins present in biological samples. For this, we proposed the model of titration of double protein (TDP) and its relevant leverage theory based on the retardation signal of chip moving reaction boundary electrophoresis (MRBE). The leverage principle states that the product of the first protein content and its absolute retardation signal is equal to that of the second protein content and its absolute retardation signal. To support the model, we first demonstrated the leverage principle theoretically. Then relevant experiments were conducted on the TDP-MRBE chip. The results revealed that (i) there was a leverage principle of retardation signal within the TDP of two pure proteins, and (ii) a lever also existed within the two complex protein samples, demonstrating the validity of the TDP model and leverage theory in the MRBE chip. It was also shown that the proposed technique could provide a rapid and simple quantitative analysis of two protein samples in a mixture. Finally, we successfully applied the developed technique to the quantification of soymilk in adulterated infant formula. TDP-MRBE opens up a new window for detecting the adulteration ratio of a low-quality food (milk) blended into a high-quality one.
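The stated leverage relation, c1·|s1| = c2·|s2|, can be used directly to back out one protein content from the other, as in this small sketch (the numbers are illustrative, not from the paper):

```python
# Minimal sketch of the leverage relation above: the product of each protein's
# content and the absolute value of its retardation signal balances, so one
# unknown content follows from the other.
def second_protein_content(c1: float, s1: float, s2: float) -> float:
    """Solve c2 from c1 * |s1| = c2 * |s2|; s1, s2 are retardation signals
    measured on the MRBE chip (arbitrary but consistent units)."""
    return c1 * abs(s1) / abs(s2)

# Example (illustrative numbers): 2.0 mg/mL of protein 1 with signal -0.30 and
# protein 2 showing signal 0.12 implies about 5.0 mg/mL of protein 2.
print(second_protein_content(c1=2.0, s1=-0.30, s2=0.12))
```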
Accelerating Use of Sustainable Materials in Transportation Infrastructure
DOT National Transportation Integrated Search
2016-05-01
With the push towards sustainable design of highway infrastructure systems, owners have shown interest in leveraging materials with minimal environmental impacts and extended service lives. Within this emphasis, most efforts on sustainable material des...
Leveraging Safety Programs to Improve and Support Security Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leach, Janice; Snell, Mark K.; Pratt, R.
2015-10-01
There has been a long history of considering Safety, Security, and Safeguards (3S) as three functions of nuclear security design and operations that need to be properly and collectively integrated with operations. This paper specifically considers how safety programmes can be extended directly to benefit security as part of an integrated facility management programme. The discussion will draw on experiences implementing such a programme at Sandia National Laboratories’ Annular Research Reactor Facility. While the paper focuses on nuclear facilities, similar ideas could be used to support security programmes at other types of high-consequence facilities and transportation activities.
Acceleration of genetic gain in cattle by reduction of generation interval.
Kasinathan, Poothappillai; Wei, Hong; Xiang, Tianhao; Molina, Jose A; Metzger, John; Broek, Diane; Kasinathan, Sivakanthan; Faber, David C; Allan, Mark F
2015-03-02
Genomic selection (GS) approaches, in combination with reproductive technologies, are revolutionizing the design and implementation of breeding programs in livestock species, particularly in cattle. GS leverages genomic readouts to provide estimates of breeding value early in the life of animals. However, the capacity of these approaches for improving genetic gain in breeding programs is limited by generation interval, the average age of an animal when replacement progeny are born. Here, we present a cost-effective approach that combines GS with reproductive technologies to reduce generation interval by rapidly producing high genetic merit calves.
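The leverage of generation interval on genetic gain can be seen from the standard breeder's equation, a general quantitative-genetics relation rather than a result of this paper; the sketch below uses illustrative parameter values.

```python
# Minimal sketch of the breeder's equation (standard relation, illustrative
# values): annual genetic gain rises as the generation interval L shrinks,
# which is the leverage point the approach above targets.
def annual_genetic_gain(i: float, r: float, sigma_g: float, L_years: float) -> float:
    """dG/year = (selection intensity * accuracy * genetic SD) / generation interval."""
    return i * r * sigma_g / L_years

conventional = annual_genetic_gain(i=1.4, r=0.70, sigma_g=1.0, L_years=5.0)
genomic_fast = annual_genetic_gain(i=1.4, r=0.70, sigma_g=1.0, L_years=2.5)
print(conventional, genomic_fast)   # halving L doubles the annual gain
```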
NASA Astrophysics Data System (ADS)
Kaul, T.; Erbert, G.; Maaßdorf, A.; Martin, D.; Crump, P.
2018-02-01
Broad area lasers that are tailored to be most efficient at the highest achievable optical output power are sought by industry to decrease operation costs and improve system performance. Devices using Extreme-Double-ASymmetric (EDAS) epitaxial designs are promising candidates for improved efficiency at high optical output powers due to low series resistance, low optical loss and low carrier leakage. However, EDAS designs leverage ultra-thin p-side waveguides, meaning that the optical mode is shifted into the n-side waveguide, resulting in a low optical confinement in the active region, low gain and hence high threshold current, limiting peak performance. We introduce here explicit design considerations that enable EDAS-based devices to be developed with increased optical confinement in the active layer without changing the p-side layer thicknesses. Specifically, this is realized by introducing a third asymmetric component in the vicinity of the quantum well. We call this approach Extreme-Triple-ASymmetric (ETAS) design. A series of ETAS-based vertical designs were fabricated into broad area lasers that deliver up to 63% power conversion efficiency at 14 W CW optical output power from a 100 μm stripe laser, which corresponds to the operation point of a kW optical output power in a laser bar. The design process, the impact of structural changes on power saturation mechanisms and finally devices with improved performance will be presented.
Passion Play: Will Wright and Games for Science Learning
ERIC Educational Resources Information Center
Ching, Dixie
2012-01-01
Researchers and instructional designers are exploring the possibilities of using video games to support STEM education in the U.S., not only because they are a popular media form among youth, but also because well-designed games often leverage the best features of inquiry learning. Those interested in using games in an educational capacity may…
ERIC Educational Resources Information Center
Saulnier, Bruce M.
2015-01-01
Problems associated with the ubiquitous presence of technology on college campuses are discussed and the concept of the flipped classroom is explained. Benefits of using the flipped classroom to offset issues associated with the presence of technology in the classroom are explored. Fink's Integrated Course Design is used to develop a flipped class…
Kris Gutiérrez: Designing with and for Diversity in the Learning Sciences
ERIC Educational Resources Information Center
Jurow, A. Susan
2016-01-01
This article reviews the significance of the theoretical and practical contributions of Kris Gutiérrez to research on science education. Gutiérrez's ideas about design and equity have inspired scholars to investigate how to leverage learners' everyday practices to make meaningful connections to disciplinary-based knowledge and skills. Her work has…
17 CFR 31.17 - Records of leverage transactions.
Code of Federal Regulations, 2010 CFR
2010-04-01
LEVERAGE TRANSACTIONS § 31.17 Records of leverage transactions. (a) Each leverage transaction merchant receiving a leverage customer's order shall immediately upon receipt thereof prepare a written record of...
A bi-stable nanoelectromechanical non-volatile memory based on van der Waals force
NASA Astrophysics Data System (ADS)
Soon, Bo Woon; Jiaqiang Ng, Eldwin; Qian, You; Singh, Navab; Julius Tsai, Minglin; Lee, Chengkuo
2013-07-01
By using complementary-metal-oxide-semiconductor processes, a silicon-based bi-stable nanoelectromechanical non-volatile memory is fabricated and characterized. The main feature of this device is an 80 nm wide and 3 μm high silicon nanofin (SiNF) of a high aspect ratio (1:35). The switching mechanism is realized by electrostatic actuation between two lateral electrodes, i.e., terminals. Bi-stable hysteresis behavior is demonstrated when the SiNF maintains its contact to one of the two terminals by leveraging van der Waals force even after the voltage bias is turned off. The compelling results indicate that this design is promising for the realization of high density non-volatile memory applications due to its nano-scale footprint and zero on-hold power consumption.
Leveraging Site Search and Analytics to Maintain a User-Centered Focus
ERIC Educational Resources Information Center
Mitchell, Erik
2011-01-01
Web design is a necessarily iterative process. During the process, it can be difficult to balance the interests and focus of the library site experts and their novice users. It can also be easy to lose focus on the main goals of site use and become wrapped up in the process of design or coding or in the internal politics of site design. Just as…
17 CFR 31.26 - Quarterly reporting requirement.
Code of Federal Regulations, 2010 CFR
2010-04-01
... leverage contract was repurchased, resold or liquidated; (i) The leverage customer account identification number; (j) Whether the leverage customer had a commercial or noncommercial leverage account; (k) Whether the leverage customer was the owner or holder of a proprietary leverage account as defined in § 31.4(e...
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Can Interest-Free Finance Limit the Frequency of Crises and the Volatility of the Business Cycle?
2011-05-14
free economics. Scholars from the International Monetary Fund (IMF) researched theoretical interest-free finance a great deal during the 1980s. In...inevitably comes, their assets devalue and their leverage increases. Moreover, a higher proportion of firms become critically leveraged. The interest...to currency crises. Whereas high interest payments might send an otherwise profitable firm to bankruptcy and foreclosure, equity financing may
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrell, William C.; Birkel, Garrett W.; Forrer, Mark
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
Creating semiconductor metafilms with designer absorption spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Soo Jin; Fan, Pengyu; Kang, Ju-Hyung
The optical properties of semiconductors are typically considered intrinsic and fixed. Here we leverage the rapid developments in the field of optical metamaterials to create ultrathin semiconductor metafilms with designer absorption spectra. We show how such metafilms can be constructed by placing one or more types of high-index semiconductor antennas into a dense array with subwavelength spacings. It is argued that the large absorption cross-section of semiconductor antennas and their weak near-field coupling open a unique opportunity to create strongly absorbing metafilms whose spectral absorption properties directly reflect those of the individual antennas. Using experiments and simulations, we demonstrate that near-unity absorption at one or more target wavelengths of interest can be achieved in a sub-50-nm-thick metafilm using judiciously sized and spaced Ge nanobeams. The ability to create semiconductor metafilms with custom absorption spectra opens up new design strategies for planar optoelectronic devices and solar cells.
Morrell, William C.; Birkel, Garrett W.; Forrer, Mark; ...
2017-08-21
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
Morrell, William C; Birkel, Garrett W; Forrer, Mark; Lopez, Teresa; Backman, Tyler W H; Dussault, Michael; Petzold, Christopher J; Baidoo, Edward E K; Costello, Zak; Ando, David; Alonso-Gutierrez, Jorge; George, Kevin W; Mukhopadhyay, Aindrila; Vaino, Ian; Keasling, Jay D; Adams, Paul D; Hillson, Nathan J; Garcia Martin, Hector
2017-12-15
Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.
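As a purely illustrative sketch (this is not EDD's actual export format or API), data exported in a standardized tabular layout could be fed to a downstream predictive algorithm along these lines:

```python
# Hypothetical standardized export: one row per (line, time, measurement type).
# This layout and the toy model below are illustrative only, not EDD's schema.
import io
import pandas as pd
from sklearn.linear_model import LinearRegression

csv = io.StringIO("""line,time_h,measurement,value
strainA,24,acetate_mM,1.2
strainA,24,glucose_mM,3.4
strainA,24,biofuel_gL,0.8
strainB,24,acetate_mM,0.4
strainB,24,glucose_mM,2.1
strainB,24,biofuel_gL,1.3
strainC,24,acetate_mM,0.9
strainC,24,glucose_mM,2.9
strainC,24,biofuel_gL,1.0
""")
wide = pd.read_csv(csv).pivot(index="line", columns="measurement", values="value")

# Toy predictive step: relate extracellular metabolite levels to product titer.
X = wide[["acetate_mM", "glucose_mM"]]
y = wide["biofuel_gL"]
print(LinearRegression().fit(X, y).coef_)
```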
17 CFR 31.8 - Cover of leverage contracts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... contracts entered into with leverage customers, and must at all times also maintain cover of at least 90... leverage customers. At least 25 percent of the amount of physical commodities subject to open long leverage... entered into with leverage customers: And, provided further, That the leverage transaction merchant...
NASA Technical Reports Server (NTRS)
Oza, Nikunj C.
2004-01-01
Ensemble Data Mining Methods, also known as Committee Methods or Model Combiners, are machine learning methods that leverage the power of multiple models to achieve better prediction accuracy than any of the individual models could on their own. The basic goal when designing an ensemble is the same as when establishing a committee of people: each member of the committee should be as competent as possible, but the members should be complementary to one another. If the members are not complementary, i.e., if they always agree, then the committee is unnecessary---any one member is sufficient. If the members are complementary, then when one or a few members make an error, the probability is high that the remaining members can correct this error. Research in ensemble methods has largely revolved around designing ensembles consisting of competent yet complementary models.
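To make the committee analogy concrete, the following minimal sketch (an illustrative example, not Oza's implementation) combines three structurally different classifiers by majority vote using scikit-learn; the dataset, model choices, and hyperparameters are arbitrary placeholders.

```python
# Minimal majority-vote ensemble sketch (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Three reasonably competent but structurally different (complementary) members.
members = [("lr", LogisticRegression(max_iter=1000)),
           ("nb", GaussianNB()),
           ("dt", DecisionTreeClassifier(max_depth=5, random_state=0))]

committee = VotingClassifier(estimators=members, voting="hard")

for name, model in members + [("ensemble", committee)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```

If the three members tend to err on different examples, the committee's cross-validated accuracy will typically match or exceed the best individual member, which is the complementarity argument made above.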
Creation of a Rapid High-Fidelity Aerodynamics Module for a Multidisciplinary Design Environment
NASA Technical Reports Server (NTRS)
Srinivasan, Muktha; Whittecar, William; Edwards, Stephen; Mavris, Dimitri N.
2012-01-01
In the traditional aerospace vehicle design process, each successive design phase is accompanied by an increment in the modeling fidelity of the disciplinary analyses being performed. This trend follows a corresponding shrinking of the design space as more and more design decisions are locked in. The correlated increase in knowledge about the design and decrease in design freedom occurs partly because increases in modeling fidelity are usually accompanied by significant increases in the computational expense of performing the analyses. When running high fidelity analyses, it is not usually feasible to explore a large number of variations, and so design space exploration is reserved for conceptual design, and higher fidelity analyses are run only once a specific point design has been selected to carry forward. The designs produced by this traditional process have been recognized as being limited by the uncertainty that is present early on due to the use of lower fidelity analyses. For example, uncertainty in aerodynamics predictions produces uncertainty in trajectory optimization, which can impact overall vehicle sizing. This effect can become more significant when trajectories are being shaped by active constraints. For example, if an optimal trajectory is running up against a normal load factor constraint, inaccuracies in the aerodynamic coefficient predictions can cause a feasible trajectory to be considered infeasible, or vice versa. For this reason, a trade must always be performed between the desired fidelity and the resources available. Apart from this trade between fidelity and computational expense, it is very desirable to use higher fidelity analyses earlier in the design process. A large body of work has been performed to this end, led by efforts in the area of surrogate modeling. In surrogate modeling, an up-front investment is made by running a high fidelity code over a Design of Experiments (DOE); once completed, the DOE data is used to create a surrogate model, which captures the relationships between input variables and responses into regression equations. Depending on the dimensionality of the problem and the fidelity of the code for which a surrogate model is being created, the initial DOE can itself be computationally prohibitive to run. Cokriging, a modeling approach from the field of geostatistics, provides a desirable compromise between computational expense and fidelity. To do this, cokriging leverages a large body of data generated by a low fidelity analysis, combines it with a smaller set of data from a higher fidelity analysis, and creates a kriging surrogate model with prediction fidelity approaching that of the higher fidelity analysis. When integrated into a multidisciplinary environment, a disciplinary analysis module employing cokriging can raise the analysis fidelity without drastically impacting the expense of design iterations. This is demonstrated through the creation of an aerodynamics analysis module in NASA's OpenMDAO framework. Aerodynamic analyses including Missile DATCOM, APAS, and USM3D are leveraged to create high fidelity aerodynamics decks for parametric vehicle geometries, which are created in NASA's Vehicle Sketch Pad (VSP). Several trade studies are performed to examine the achieved level of model fidelity, and the overall impact to vehicle design is quantified.
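As a rough illustration of the multi-fidelity idea described above, the sketch below fits a Gaussian-process surrogate to a large low-fidelity sample and a second surrogate to the discrepancy observed at a few high-fidelity points. This additive-correction scheme is a simplification of cokriging, and the analysis functions, sample sizes, and kernels are invented for the example rather than taken from the paper.

```python
# Simplified multi-fidelity surrogate sketch (additive discrepancy model, not full cokriging).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def lofi(x):   # cheap, biased analysis (stand-in for an engineering-level code)
    return np.sin(8 * x) * x

def hifi(x):   # expensive, accurate analysis (stand-in for a CFD-class code)
    return np.sin(8 * x) * x + 0.3 * x + 0.1

x_lo = np.linspace(0, 1, 40)[:, None]   # large low-fidelity DOE
x_hi = np.linspace(0, 1, 6)[:, None]    # small high-fidelity DOE

gp_lo = GaussianProcessRegressor(RBF(0.1)).fit(x_lo, lofi(x_lo).ravel())
# Model only the discrepancy between fidelities at the few high-fidelity points.
delta = hifi(x_hi).ravel() - gp_lo.predict(x_hi)
gp_delta = GaussianProcessRegressor(RBF(0.3)).fit(x_hi, delta)

x_new = np.array([[0.37]])
prediction = gp_lo.predict(x_new) + gp_delta.predict(x_new)  # approaches hi-fi accuracy
print(prediction, hifi(x_new).ravel())
```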
Amyris, Inc. Integrated Biorefinery Project Summary Final Report - Public Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, David; Sato, Suzanne; Garcia, Fernando
The Amyris pilot-scale Integrated Biorefinery (IBR) leveraged Amyris synthetic biology and process technology experience to upgrade Amyris’s existing Emeryville, California pilot plant and fermentation labs to enable development of US-based production capabilities for renewable diesel fuel and alternative chemical products. These products were derived semi-synthetically from high-impact biomass feedstocks via microbial fermentation to the 15-carbon intermediate farnesene, with subsequent chemical finishing to farnesane. The Amyris IBR team tested and provided methods for production of diesel and alternative chemical products from sweet sorghum, and other high-impact lignocellulosic feedstocks, at pilot scale. This enabled robust techno-economic analysis (TEA), regulatory approvals, and a basis for full-scale manufacturing processes and facility design.
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1997-01-01
Topics considered include: high-performance computing; cognitive and perceptual prostheses (computational aids designed to leverage human abilities); autonomous systems. Also included: development of a 3D unstructured grid code based on a finite volume formulation and applied to the Navier-Stokes equations; Cartesian grid methods for complex geometry; multigrid methods for solving elliptic problems on unstructured grids; algebraic non-overlapping domain decomposition methods for compressible fluid flow problems on unstructured meshes; numerical methods for the compressible Navier-Stokes equations with application to aerodynamic flows; research in aerodynamic shape optimization; S-HARP: a parallel dynamic spectral partitioner; numerical schemes for the Hamilton-Jacobi and level set equations on triangulated domains; application of high-order shock capturing schemes to direct simulation of turbulence; multicast technology; network testbeds; supercomputer consolidation project.
Providing the Missing Link: the Exposure Science Ontology ExO
Although knowledge-discovery tools are new to the exposure science community, these tools are critical for leveraging exposure information to design health studies and interpret results for improved public health decisions. Standardized ontologies define relationships, allow for ...
NASA Technical Reports Server (NTRS)
Benowitz, E.; Niessner, A.
2003-01-01
This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). This work currently leverages actual flight software used in the design of NASA's Deep Space 1 (DS1) mission, which flew in 1998.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Cyrus; Larsen, Matt; Brugger, Eric
Strawman is a system designed to explore the in situ visualization and analysis needs of simulation code teams running multi-physics calculations on many-core HPC architectures. It provides rendering pipelines that can leverage both many-core CPUs and GPUs to render images of simulation meshes.
Hybrid FSAE Vehicle Realization
DOT National Transportation Integrated Search
2010-12-01
The goal of this multi-year project is to create a fully functional University of Idaho entry in the hybrid FSAE competition. Vehicle integration is underway as part of a variety of 2010-11 senior design projects. This leverages a variety of analytic...
Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2009-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
LEO to GEO (and Beyond) Transfers Using High Power Solar Electric Propulsion (HP-SEP)
NASA Technical Reports Server (NTRS)
Loghry, Christopher S.; Oleson, Steven R.; Woytach, Jeffrey M.; Martini, Michael C.; Smith, David A.; Fittje, James E.; Gyekenyesi, John Z.; Colozza, Anthony J.; Fincannon, James; Bogner, Aimee;
2017-01-01
Rideshare, or Multi-Payload launch configurations, are becoming more and more commonplace but access to space is only one part of the overall mission needs. The ability for payloads to achieve their target orbits or destinations can still be difficult and potentially not feasible with on-board propulsion limitations. The High Power Solar Electric Propulsion (HP-SEP) Orbital Maneuvering Vehicle (OMV) provides transfer capabilities for both large and small payloads in excess of what is possible with chemical propulsion. Leveraging existing secondary payload adapter technology like the ESPA provides a platform to support Multi-Payload launch and missions. When coupled with HP-SEP, meaning greater than 30 kW system power, very large delta-V maneuvers can be accomplished. The HP-SEP OMV concept is designed to perform a Low Earth Orbit to Geosynchronous Orbit (LEO-GEO) transfer of up to six payloads each with 300 kg mass. The OMV has enough capability to perform this 6 km/s maneuver and have residual capacity to extend an additional transfer from GEO to Lunar orbit. This high delta-V capability is achieved using state of the art 12.5 kW Hall Effect Thrusters (HET) coupled with high power roll-up solar arrays. The HP-SEP OMV also provides a demonstration platform for other SEP technologies such as advanced Power Processing Units (PPU), Xenon Feed Systems (XFS), and other HET technologies. The HP-SEP OMV platform can be leveraged for other missions as well such as interplanetary science missions and applications for resilient space architectures.
ERIC Educational Resources Information Center
Gauthier, Andrea; Jenkinson, Jodie
2017-01-01
We designed a serious game, MolWorlds, to facilitate conceptual change about molecular emergence by using game mechanics (resource management, immersed 3rd person character, sequential level progression, and 3-star scoring system) to encourage cycles of productive negativity. We tested the value-added effect of game design by comparing and…
High gradient magnetic field microstructures for magnetophoretic cell separation.
Abdel Fattah, Abdel Rahman; Ghosh, Suvojit; Puri, Ishwar K
2016-08-01
Microfluidics has advanced magnetic blood fractionation by making integrated miniature devices possible. A ferromagnetic microstructure array that is integrated with a microfluidic channel rearranges an applied magnetic field to create a high gradient magnetic field (HGMF). By leveraging the differential magnetic susceptibilities of cell types contained in a host medium, such as paramagnetic red blood cells (RBCs) and diamagnetic white blood cells (WBCs), the resulting HGMF can be used to continuously separate them without attaching additional labels, such as magnetic beads, to them. We describe the effect these ferromagnetic microstructure geometries have on blood separation efficacy by numerically simulating the influence of microstructure height and pitch on the HGMF characteristics and resulting RBC separation. Visualizations of RBC trajectories provide insight into how arrays can be optimized to best separate these cells from a host fluid. Periodic microstructures are shown to moderate the applied field due to magnetic interference between the adjacent teeth of an array. Since continuous microstructures do not similarly weaken the resultant HGMF, they facilitate significantly higher RBC separation. Nevertheless, periodic arrays are more appropriate for relatively deep microchannels since, unlike continuous microstructures, their separation effectiveness is independent of depth. The results are relevant to the design of microfluidic devices that leverage HGMFs to fractionate blood by separating RBCs and WBCs. Copyright © 2016 Elsevier B.V. All rights reserved.
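For readers unfamiliar with magnetophoresis, a commonly used first-order expression for the force on a weakly magnetic particle (a textbook form, not necessarily the exact model used in the simulations above) is

```latex
\vec{F}_m = \frac{V_p\,\Delta\chi}{\mu_0}\,(\vec{B}\cdot\nabla)\vec{B},
\qquad \Delta\chi = \chi_{\mathrm{cell}} - \chi_{\mathrm{medium}},
```

where V_p is the cell volume. The ferromagnetic microstructures matter because they amplify the gradient term (B·∇)B, and the sign of Δχ explains why paramagnetic RBCs and diamagnetic WBCs migrate differently within the same HGMF.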
NASA Astrophysics Data System (ADS)
Hu, Hui; Ning, Zhe
2016-11-01
Due to the auto-rotating trait of maple seeds during the falling process, the flow characteristics of rotating maple seeds have been studied by many researchers in recent years. In the present study, an experimental investigation was performed to explore maple-seed-inspired UAV propellers for improved aerodynamic and aeroacoustic performances. Inspired by the auto-rotating trait of maple seeds, the shape of a maple seed is leveraged for the planform design of UAV propellers. The aerodynamic and aeroacoustic performances of the maple-seed-inspired propellers are examined in great detail, in comparison with a commercially available UAV propeller purchased on the market (i.e., a baseline propeller). During the experiments, in addition to measuring the aerodynamic forces generated by the maple-seed-inspired propellers and the baseline propeller, a high-resolution Particle Image Velocimetry (PIV) system was used to quantify the unsteady flow structures in the wakes of the propellers. The aeroacoustic characteristics of the propellers are also evaluated by leveraging an anechoic chamber available at the Aerospace Engineering Department of Iowa State University. The research work is supported by the National Science Foundation under Award Number OSIE-1064235.
Human Health and Climate Change: Leverage Points for Adaptation in Urban Environments
Proust, Katrina; Newell, Barry; Brown, Helen; Capon, Anthony; Browne, Chris; Burton, Anthony; Dixon, Jane; Mu, Lisa; Zarafu, Monica
2012-01-01
The design of adaptation strategies that promote urban health and well-being in the face of climate change requires an understanding of the feedback interactions that take place between the dynamical state of a city, the health of its people, and the state of the planet. Complexity, contingency and uncertainty combine to impede the growth of such systemic understandings. In this paper we suggest that the collaborative development of conceptual models can help a group to identify potential leverage points for effective adaptation. We describe a three-step procedure that leads from the development of a high-level system template, through the selection of a problem space that contains one or more of the group’s adaptive challenges, to a specific conceptual model of a sub-system of importance to the group. This procedure is illustrated by a case study of urban dwellers’ maladaptive dependence on private motor vehicles. We conclude that a system dynamics approach, revolving around the collaborative construction of a set of conceptual models, can help communities to improve their adaptive capacity, and so better meet the challenge of maintaining, and even improving, urban health in the face of climate change. PMID:22829795
NASA Technical Reports Server (NTRS)
Mehling, Joshua S.; Holley, James; O'Malley, Marcia K.
2015-01-01
The fidelity with which series elastic actuators (SEAs) render desired impedances is important. Numerous approaches to SEA impedance control have been developed under the premise that high-precision actuator torque control is a prerequisite. Indeed, the design of an inner torque compensator has a significant impact on actuator impedance rendering. The disturbance observer (DOB) based torque control implemented in NASA's Valkyrie robot is considered here and a mathematical model of this torque control, cascaded with an outer impedance compensator, is constructed. While previous work has examined the impact a disturbance observer has on torque control performance, little has been done regarding DOBs and impedance rendering accuracy. Both simulation and a series of experiments are used to demonstrate the significant improvements possible in an SEA's ability to render desired dynamic behaviors when utilizing a DOB. Actuator transparency at low impedances is improved, closed loop hysteresis is reduced, and the actuator's dynamic response to both commands and interaction torques more faithfully matches that of the desired model. All of this is achieved by leveraging DOB based control rather than increasing compensator gains, thus making improved SEA impedance control easier to achieve in practice.
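For context, a textbook disturbance-observer structure (shown here only as a generic sketch; the Valkyrie implementation details are in the paper) estimates the lumped disturbance from the measured torque and a nominal plant model and subtracts it from the command:

```latex
\hat{d} = Q(s)\left(P_n^{-1}(s)\,\tau_{\mathrm{meas}} - u\right), \qquad
u = \tau_{\mathrm{des}} - \hat{d},
```

where P_n(s) is the nominal actuator model and Q(s) a low-pass filter whose bandwidth bounds how aggressively model mismatch and hysteresis are rejected; this is what allows impedance rendering to improve without raising the outer compensator gains.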
2013-12-01
Protective Equipment Sizing and Design,” Human Factors: The Journal of the Human Factors and Ergonomics Society 55, no. 1 (2013): 6–35; Hsiao ... firefighters. The information will be used to improve apparatus design, revise NFPA 1901 Standard for Automotive Fire Apparatus, and improve cab, seat ... Design.” Human Factors: The Journal of the Human Factors and Ergonomics Society 55, no. 1 (2013): 6–35. ———. Sizing Firefighters and Fire Apparatus
NASA Technical Reports Server (NTRS)
Boulanger, Richard; Overland, David
2004-01-01
Technologies that facilitate the design and control of complex, hybrid, and resource-constrained systems are examined. This paper focuses on design methodologies, and system architectures, not on specific control methods that may be applied to life support subsystems. Honeywell and Boeing have estimated that 60-80% of the effort in developing complex control systems is software development, and only 20-40% is control system development. It has also been shown that large software projects have failure rates as high as 50-65%. Concepts discussed include the Unified Modeling Language (UML) and design patterns with the goal of creating a self-improving, self-documenting system design process. Successful architectures for control must not only facilitate hardware to software integration, but must also reconcile continuously changing software with much less frequently changing hardware. These architectures rely on software modules or components to facilitate change. Architecting such systems for change leverages the interfaces between these modules or components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fadika, Zacharia; Dede, Elif; Govindaraju, Madhusudhan
MapReduce is increasingly becoming a popular framework, and a potent programming model. The most popular open source implementation of MapReduce, Hadoop, is based on the Hadoop Distributed File System (HDFS). However, as HDFS is not POSIX compliant, it cannot be fully leveraged by applications running on a majority of existing HPC environments such as Teragrid and NERSC. These HPC environments typically support globally shared file systems such as NFS and GPFS. On such resourceful HPC infrastructures, the use of Hadoop not only creates compatibility issues, but also affects overall performance due to the added overhead of the HDFS. This paper not only presents a MapReduce implementation directly suitable for HPC environments, but also exposes the design choices for better performance gains in those settings. By leveraging inherent distributed file systems' functions, and abstracting them away from its MapReduce framework, MARIANE (MApReduce Implementation Adapted for HPC Environments) not only allows for the use of the model in an expanding number of HPC environments, but also allows for better performance in such settings. This paper shows the applicability and high performance of the MapReduce paradigm through MARIANE, an implementation designed for clustered and shared-disk file systems and as such not dedicated to a specific MapReduce solution. The paper identifies the components and trade-offs necessary for this model, and quantifies the performance gains exhibited by our approach in distributed environments over Apache Hadoop in a data intensive setting, on the Magellan testbed at the National Energy Research Scientific Computing Center (NERSC).
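The design point above, running MapReduce directly against a shared POSIX file system instead of HDFS, can be illustrated with the toy word-count sketch below; it is not MARIANE's API, and the file path and process pool are hypothetical placeholders.

```python
# Toy MapReduce-style word count over a shared POSIX file system (illustrative only).
import glob
from collections import Counter
from multiprocessing import Pool

def map_phase(path):                      # mapper: one input split per file
    with open(path) as f:
        return Counter(f.read().split())

def reduce_phase(partials):               # reducer: merge partial counts
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    splits = glob.glob("/shared/gpfs/dataset/*.txt")   # hypothetical shared-FS path
    with Pool() as pool:
        counts = reduce_phase(pool.map(map_phase, splits))
    print(counts.most_common(10))
```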
Anderson, Caitlin E; Holstein, Carly A; Strauch, Eva-Maria; Bennett, Steven; Chevalier, Aaron; Nelson, Jorgen; Fu, Elain; Baker, David; Yager, Paul
2017-06-20
Influenza is a ubiquitous and recurring infection that results in approximately 500 000 deaths globally each year. Commercially available rapid diagnostic tests are based upon detection of the influenza nucleoprotein, which are limited in that they are unable to differentiate by species and require an additional viral lysis step. Sample preprocessing can be minimized or eliminated by targeting the intact influenza virus, thereby reducing assay complexity and leveraging the large number of hemagglutinin proteins on the surface of each virus. Here, we report the development of a paper-based influenza assay that targets the hemagglutinin protein; the assay employs a combination of antibodies and novel computationally designed, recombinant affinity proteins as the capture and detection agents. This system leverages the customizability of recombinant protein design to target the conserved receptor-binding pocket of the hemagglutinin protein and to match the trimeric nature of hemagglutinin for improved avidity. Using this assay, we demonstrate the first instance of intact influenza virus detection using a combination of antibody and affinity proteins within a porous network. The recombinant head region binder based assays yield superior analytical sensitivity as compared to the antibody based assay, with lower limits of detection of 3.54 × 10^7 and 1.34 × 10^7 CEID50/mL for the mixed and all binder stacks, respectively. Not only does this work describe the development of a novel influenza assay, it also demonstrates the power of recombinant affinity proteins for use in rapid diagnostic assays.
Modeling Group Interactions via Open Data Sources
2011-08-30
data. The state-of-the-art search engines are designed to support general query-specific search and are not suitable for finding disconnected online groups. The...groups, (2) developing innovative mathematical and statistical models and efficient algorithms that leverage existing search engines and employ
... training, and resource development for Federal government energy projects that leverage utility industry ... The design of technical training plans for sustained performance of energy conservation measures ... Advanced Utility Energy Services Contract Training, 2012, accredited by the International Association for ...
2011-09-01
AND EXPERIMENTAL DESIGN ... PRIMARY RESEARCH QUESTION ... OBJECTIVE ACHIEVEMENT ... Based Outpatient Clinic; CPT, Cognitive Processing Therapy; DISE, Distributed Information Systems Experimentation; EBT, Evidence-Based Treatment; GMC
NASA Astrophysics Data System (ADS)
Mallick, Rajnish; Ganguli, Ranjan; Kumar, Ravi
2017-05-01
The optimized design of a smart post-buckled beam actuator (PBA) is performed in this study. A smart material based piezoceramic stack actuator is used as a prime-mover to drive the buckled beam actuator. Piezoceramic actuators are high force, small displacement devices; they possess high energy density and have high bandwidth. In this study, bench top experiments are conducted to investigate the angular tip deflections due to the PBA. A new design of a linear-to-linear motion amplification device (LX-4) is developed to circumvent the small displacement handicap of piezoceramic stack actuators. LX-4 enhances the piezoceramic actuator mechanical leverage by a factor of four. The PBA model is based on dynamic elastic stability and is analyzed using the Mathieu-Hill equation. A formal optimization is carried out using a newly developed meta-heuristic, nature-inspired algorithm named the bat algorithm (BA). The BA utilizes the echolocation capability of bats. An optimized PBA in conjunction with LX-4 generates end rotations of the order of 15° at the output end. The optimized PBA design incurs less weight and induces large end rotations, which will be useful in the development of various mechanical and aerospace devices, such as helicopter trailing edge flaps, micro and nano aerial vehicles and other robotic systems.
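For reference, dynamic-stability analyses of parametrically excited beams are typically built on the canonical Mathieu-Hill form (a generic statement of the equation, not the paper's specific derivation):

```latex
\ddot{q}(t) + \bigl(\delta + 2\varepsilon\cos(\Omega t)\bigr)\,q(t) = 0,
```

where, loosely, δ reflects the static axial load relative to the buckling load and ε the amplitude of the parametric excitation supplied by the piezoceramic stack; the stability chart in the (δ, ε) plane then bounds the admissible drive amplitudes and frequencies for a PBA of this kind.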
Leveraging prior quantitative knowledge in guiding pediatric drug development: a case study.
Jadhav, Pravin R; Zhang, Jialu; Gobburu, Jogarao V S
2009-01-01
The manuscript presents the FDA's focus on leveraging prior knowledge in designing an informative pediatric trial through this case study. In developing the written request for Drug X, an anti-hypertensive for immediate blood pressure (BP) control, the sponsor and FDA conducted clinical trial simulations (CTS) to design a trial with a proper sample size and support the choice of dose range. The objective was to effectively use prior knowledge from adult patients for drug X, pediatric data from the Corlopam (approved for a similar indication) trial, and general experience in developing anti-hypertensive agents. Different scenarios governing the exposure response relationship in the pediatric population were simulated to perturb model assumptions. The choice of scenarios was based on the past observation that the pediatric population is less responsive and sensitive compared with adults. The conceptual framework presented here should serve as an example of how industry and FDA scientists can collaborate in designing the pediatric exclusivity trial. Using CTS, inter-disciplinary scientists with the sponsor and FDA can objectively discuss the choice of dose range, sample size, endpoints and other design elements. These efforts are believed to yield plausible trial designs, rational dosing recommendations and useful labeling information in pediatrics. Published in 2009 by John Wiley & Sons, Ltd.
Leveraging FPGAs for Accelerating Short Read Alignment.
Arram, James; Kaplan, Thomas; Luk, Wayne; Jiang, Peiyong
2017-01-01
One of the key challenges facing genomics today is how to efficiently analyze the massive amounts of data produced by next-generation sequencing platforms. With general-purpose computing systems struggling to address this challenge, specialized processors such as the Field-Programmable Gate Array (FPGA) are receiving growing interest. The means by which to leverage this technology for accelerating genomic data analysis is however largely unexplored. In this paper, we present a runtime reconfigurable architecture for accelerating short read alignment using FPGAs. This architecture exploits the reconfigurability of FPGAs to allow the development of fast yet flexible alignment designs. We apply this architecture to develop an alignment design which supports exact and approximate alignment with up to two mismatches. Our design is based on the FM-index, with optimizations to improve the alignment performance. In particular, the n-step FM-index, index oversampling, a seed-and-compare stage, and bi-directional backtracking are included. Our design is implemented and evaluated on a 1U Maxeler MPC-X2000 dataflow node with eight Altera Stratix-V FPGAs. Measurements show that our design is 28 times faster than Bowtie2 running with 16 threads on dual Intel Xeon E5-2640 CPUs, and nine times faster than Soap3-dp running on an NVIDIA Tesla C2070 GPU.
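To clarify the core primitive being accelerated, the sketch below implements plain FM-index backward search for exact matching in Python; the FPGA design described above layers the n-step index, oversampling, the seed-and-compare stage, and bounded-mismatch backtracking on top of this basic loop, none of which are shown here.

```python
# Minimal FM-index exact backward search (illustrative sketch only).
def build_fm_index(text):
    text += "$"                                   # sentinel, lexicographically smallest
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)        # BWT via suffix array
    alphabet = sorted(set(text))
    # C[c]: count of characters strictly smaller than c in the text
    C, total = {}, 0
    for c in alphabet:
        C[c] = total
        total += text.count(c)
    # occ[i][c]: occurrences of c in bwt[:i]
    occ = [dict.fromkeys(alphabet, 0)]
    for ch in bwt:
        row = dict(occ[-1])
        row[ch] += 1
        occ.append(row)
    return C, occ, len(bwt)

def count_matches(pattern, C, occ, n):
    sp, ep = 0, n                                 # current suffix-array interval
    for ch in reversed(pattern):
        if ch not in C:
            return 0
        sp = C[ch] + occ[sp][ch]
        ep = C[ch] + occ[ep][ch]
        if sp >= ep:
            return 0
    return ep - sp                                # number of exact occurrences

C, occ, n = build_fm_index("ACGTACGTACGA")
print(count_matches("ACGT", C, occ, n))           # -> 2
```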
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable this flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capability from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library was developed called Alquimia. To ensure that Amanzi is truly an open-source community code we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including the testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
Developing Web-based Tools for Collaborative Science and Public Outreach
NASA Astrophysics Data System (ADS)
Friedman, A.; Pizarro, O.; Williams, S. B.
2016-02-01
With the advances in high bandwidth communications and the proliferation of social media tools, education & outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics & other data collecting platforms have made it possible to collect copious amounts of oceanographic data. This data then typically undergoes laborious, manual processing to transform it into quantitative information, which normally occurs post cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and attempt to leverage public engagement in order to complement science goals. We will present two software platforms: the first is a web browser based tool that was developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation & engagement and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real-time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration & analysis and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly streamlined interface that integrates advanced data management & online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions and a gamified ranking system to encourage "citizen science" participation. These examples show that through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility for acquiring large volumes of crowd-sourced data without compromising science objectives.
Measuring the Internal Environment of Solid Rocket Motors During Ignition
NASA Technical Reports Server (NTRS)
Weisenberg, Brent; Smith, Doug; Speas, Kyle; Corliss, Adam
2003-01-01
A new instrumentation system has been developed to measure the internal environment of solid rocket test motors during motor ignition. The system leverages conventional, analog gages with custom designed, electronics modules to provide safe, accurate, high speed data acquisition capability. To date, the instrumentation system has been demonstrated in a laboratory environment and on subscale static fire test motors ranging in size from 5-inches to 24-inches in diameter. Ultimately, this system is intended to be installed on a full-scale Reusable Solid Rocket Motor. This paper explains the need for the data, the components and capabilities of the system, and the test results.
Photonic Doppler Velocimetry Multiplexing Techniques: Evaluation of Photonic Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edward Daykin
This poster reports progress related to photonic technologies. Specifically, the authors developed a diagnostic system architecture for Multiplexed Photonic Doppler Velocimetry (MPDV) that incorporates frequency- and time-division multiplexing into existing PDV methodology to provide increased channel count. The current MPDV design increases the number of data records per digitizer channel by 8x, and also operates as a laser-safe (Class 3a) system. Further, they applied heterodyne interferometry to allow for direction-of-travel determination and enable high-velocity measurements (>10 km/s) via optical downshifting. They also leveraged commercially available, inexpensive and robust components originally developed for telecom applications. Proposed MPDV architectures employ only commercially available, fiber-coupled hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ives, Robert Lawrence; Marsden, David; Collins, George
Calabazas Creek Research, Inc. developed a 1.5 MW RF load for the ITER fusion research facility currently under construction in France. This program leveraged technology developed in two previous SBIR programs that successfully developed high power RF loads for fusion research applications. This program specifically focused on modifications required by revised technical performance, materials, and assembly specification for ITER. This program implemented an innovative approach to actively distribute the RF power inside the load to avoid excessive heating or arcing associated with constructive interference. The new design implemented materials and assembly changes required to meet specifications. Critical components were built and successfully tested during the program.
SNL Mechanical Computer Aided Design (MCAD) guide 2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Brandon; Pollice, Stephanie L.; Martinez, Jack R.
2007-12-01
This document is a mechanical design best-practice guide for new and experienced designers alike. The contents consist of topics related to using Computer Aided Design (CAD) software, performing basic analyses, and using configuration management. The details specific to a particular topic have been leveraged against existing Product Realization Standard (PRS) and Technical Business Practice (TBP) requirements while maintaining alignment with sound engineering and design practices. This document is to be considered dynamic in that subsequent updates will be reflected in the main title, and each update will be published on an annual basis.
76 FR 76977 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-09
... Program LIHEAP Leveraging Report. OMB No.: 0970-0121. Description: The LIHEAP leveraging incentive program rewards LIHEAP grantees that have leveraged non-federal home energy resources for low-income households. The LIHEAP leveraging report is the application for leveraging incentive funds that these LIHEAP...
Code of Federal Regulations, 2010 CFR
2010-04-01
... a margin deficiency without effecting personal contact with the leverage customer. If a leverage transaction merchant is unable to effect personal contact with a leverage customer, a telegram sent to the leverage customer at the address furnished by the customer to the leverage transaction merchant shall be...
76 FR 10366 - Agency Information Collection Request; 60-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
...). Abstract: This research leverages best practices in behavior change, interaction design, and service innovation to increase the understanding and adoption of Comparative Effectiveness Research (CER) information..., translation, and adoption of evidence-based, outcomes-oriented CER findings. Comparative Effectiveness...
50 CFR 84.32 - What are the ranking criteria?
Code of Federal Regulations, 2013 CFR
2013-10-01
... improvements to the quality of the coastal wetland and associated waters through protection from contaminants... project proposal designed to leverage other ongoing coastal wetlands protection projects in the area, such... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE AND SPORT FISH RESTORATION PROGRAM NATIONAL COASTAL WETLANDS...
50 CFR 84.32 - What are the ranking criteria?
Code of Federal Regulations, 2014 CFR
2014-10-01
... improvements to the quality of the coastal wetland and associated waters through protection from contaminants... project proposal designed to leverage other ongoing coastal wetlands protection projects in the area, such... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE AND SPORT FISH RESTORATION PROGRAM NATIONAL COASTAL WETLANDS...
50 CFR 84.32 - What are the ranking criteria?
Code of Federal Regulations, 2010 CFR
2010-10-01
... improvements to the quality of the coastal wetland and associated waters through protection from contaminants... project proposal designed to leverage other ongoing coastal wetlands protection projects in the area, such... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM NATIONAL COASTAL WETLANDS...
50 CFR 84.32 - What are the ranking criteria?
Code of Federal Regulations, 2012 CFR
2012-10-01
... improvements to the quality of the coastal wetland and associated waters through protection from contaminants... project proposal designed to leverage other ongoing coastal wetlands protection projects in the area, such... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM NATIONAL COASTAL WETLANDS...
50 CFR 84.32 - What are the ranking criteria?
Code of Federal Regulations, 2011 CFR
2011-10-01
... improvements to the quality of the coastal wetland and associated waters through protection from contaminants... project proposal designed to leverage other ongoing coastal wetlands protection projects in the area, such... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM NATIONAL COASTAL WETLANDS...
Leveraging Information Technology. Track IV: Support Services.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Seven papers from the 1987 CAUSE conference's Track IV, Support Services, are presented. They include: "Application Development Center" (John F. Leydon); "College Information Management System: The Design and Implementation of a Completely Integrated Office Automation and Student Information System" (Karen L. Miselis);…
Breakthroughs in Low Profile Leaky Wave HPM Antennas
2016-10-17
3D RF modeling, but the design time and effort will be greatly reduced compared to starting from scratch. The LWAs featured here exhibit beam...Section 4 present related and novel antenna designs that leverage some of the concepts from this research program. Section 5 and Section 6 present...parameters that we used previously for the wire-grill design in Figure 3, but this time with the intent to combine it with an acrylic (εr=2.55) window of
Consortium for Robotics & Unmanned Systems Education & Research (CRUSER)
2012-09-30
as facilities at Camp Roberts, Calif. and frequent experimentation events, the Many vs. Many (MvM) Autonomous Systems Testbed provides the...and expediently translate theory to practice. The MvM Testbed is designed to integrate technological advances in hardware (inexpensive, expendable...designed to leverage the MvM Autonomous Systems Testbed to explore practical and operationally relevant avenues to counter these “swarm” opponents, and
ERIC Educational Resources Information Center
Research For Action, 2014
2014-01-01
Funded by The Bill & Melinda Gates Foundation, the Literacy Design Collaborative (LDC) and Math Design Collaborative (MDC) offer a set of instructional and formative assessment tools in literacy and math, which were developed to help educators better prepare all students to meet the Common Core State Standards (CCSS) and succeed beyond high…
13 CFR 108.1100 - Type of Leverage and application procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Type of Leverage and application... MARKETS VENTURE CAPITAL (“NMVC”) PROGRAM SBA Financial Assistance for NMVC Companies (Leverage) General Information About Obtaining Leverage § 108.1100 Type of Leverage and application procedures. (a) Type of...
RXIO: Design and implementation of high performance RDMA-capable GridFTP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Yuan; Yu, Weikuan; Vetter, Jeffrey S.
2011-12-21
For its low-latency, high bandwidth, and low CPU utilization, Remote Direct Memory Access (RDMA) has established itself as an effective data movement technology in many networking environments. However, the transport protocols of grid run-time systems, such as GridFTP in Globus, are not yet capable of utilizing RDMA. In this study, we examine the architecture of GridFTP for the feasibility of enabling RDMA. An RDMA-capable XIO (RXIO) framework is designed and implemented to extend its XIO system and match the characteristics of RDMA. Our experimental results demonstrate that RDMA can significantly improve the performance of GridFTP, reducing the latency by 32% and increasing the bandwidth by more than three times. In achieving such performance improvements, RDMA dramatically cuts down CPU utilization of GridFTP clients and servers. In conclusion, these results demonstrate that RXIO can effectively exploit the benefits of RDMA for GridFTP. It offers a good prototype to further leverage GridFTP on wide-area RDMA networks.
Ultra-thin high-efficiency mid-infrared transmissive Huygens meta-optics.
Zhang, Li; Ding, Jun; Zheng, Hanyu; An, Sensong; Lin, Hongtao; Zheng, Bowen; Du, Qingyang; Yin, Gufan; Michon, Jerome; Zhang, Yifei; Fang, Zhuoran; Shalaginov, Mikhail Y; Deng, Longjiang; Gu, Tian; Zhang, Hualiang; Hu, Juejun
2018-04-16
The mid-infrared (mid-IR) is a strategically important band for numerous applications ranging from night vision to biochemical sensing. Here we theoretically analyzed and experimentally realized a Huygens metasurface platform capable of fulfilling a diverse cross-section of optical functions in the mid-IR. The meta-optical elements were constructed using high-index chalcogenide films deposited on fluoride substrates: the choices of wide-band transparent materials allow the design to be scaled across a broad infrared spectrum. Capitalizing on a two-component Huygens' meta-atom design, the meta-optical devices feature an ultra-thin profile (λ0/8 in thickness) and measured optical efficiencies up to 75% in transmissive mode for linearly polarized light, representing major improvements over state-of-the-art. We have also demonstrated mid-IR transmissive meta-lenses with diffraction-limited focusing and imaging performance. The projected size, weight and power advantages, coupled with the manufacturing scalability leveraging standard microfabrication technologies, make the Huygens meta-optical devices promising for next-generation mid-IR system applications.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
CFD Analysis of Emissions for a Candidate N+3 Combustor
NASA Technical Reports Server (NTRS)
Ajmani, Kumud
2015-01-01
An effort was undertaken to analyze the performance of a model Lean-Direct Injection (LDI) combustor designed to meet emissions and performance goals for NASA's N+3 program. Computational predictions of Emissions Index (EINOx) and combustor exit temperature were obtained for operation at typical power conditions expected of a small-core, high pressure-ratio (greater than 50), high T3 inlet temperature (greater than 950K) N+3 combustor. Reacting-flow computations were performed with the National Combustion Code (NCC) for a model N+3 LDI combustor, which consisted of a nine-element LDI flame-tube derived from a previous generation (N+2) thirteen-element LDI design. A consistent approach to mesh-optimization, spray-modeling and kinetics-modeling was used, in order to leverage the lessons learned from previous N+2 flame-tube analysis with the NCC. The NCC predictions for the current, non-optimized N+3 combustor operating indicated a 74% increase in NOx emissions as compared to that of the emissions-optimized, parent N+2 LDI combustor.
Virtualized Multi-Mission Operations Center (vMMOC) and its Cloud Services
NASA Technical Reports Server (NTRS)
Ido, Haisam Kassim
2017-01-01
This presentation will cover the current and future technical and organizational opportunities and challenges of virtualizing a multi-mission operations center. The full deployment of Goddard Space Flight Center's (GSFC) Virtualized Multi-Mission Operations Center (vMMOC) is nearly complete. The Space Science Mission Operations (SSMO) organization's spacecraft ACE, Fermi, LRO, MMS (4), OSIRIS-REx, SDO, SOHO, Swift, and Wind are in the process of being fully migrated to the vMMOC. The benefits of the vMMOC will be the normalization and the standardization of IT services, mission operations, maintenance, and development as well as ancillary services and policies such as collaboration tools, change management systems, and IT Security. The vMMOC will also provide operational efficiencies regarding hardware, IT domain expertise, training, maintenance and support. The presentation will also cover SSMO's secure Situational Awareness Dashboard in an integrated, fleet centric, cloud based web services fashion. Additionally the SSMO Telemetry as a Service (TaaS) will be covered, which allows authorized users and processes to access telemetry for the entire SSMO fleet, and for the entirety of each spacecraft's history. Both services leverage cloud services in a secure FISMA High and FedRAMP environment, and also leverage distributed object stores in order to house and provide the telemetry. The services are also in the process of leveraging the cloud computing services' elasticity and horizontal scalability. In the design phase is the Navigation as a Service (NaaS), which will provide a standardized, efficient, and normalized service for the fleet's space flight dynamics operations. Additional future services that may be considered are Ground Segment as a Service (GSaaS), Telemetry and Command as a Service (TCaaS), Flight Software Simulation as a Service, etc.
Jackson, Rebecca D; Best, Thomas M; Borlawsky, Tara B; Lai, Albert M; James, Stephen; Gurcan, Metin N
2012-01-01
The conduct of clinical and translational research regularly involves the use of a variety of heterogeneous and large-scale data resources. Scalable methods for the integrative analysis of such resources, particularly when attempting to leverage computable domain knowledge in order to generate actionable hypotheses in a high-throughput manner, remain an open area of research. In this report, we describe both a generalizable design pattern for such integrative knowledge-anchored hypothesis discovery operations and our experience in applying that design pattern in the experimental context of a set of driving research questions related to the publicly available Osteoarthritis Initiative data repository. We believe that this ‘test bed’ project and the lessons learned during its execution are both generalizable and representative of common clinical and translational research paradigms. PMID:22647689
Accelerating k-NN Algorithm with Hybrid MPI and OpenSHMEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jian; Hamidouche, Khaled; Zheng, Jie
2015-08-05
Machine Learning algorithms are benefiting from the continuous improvement of programming models, including MPI, MapReduce and PGAS. The k-Nearest Neighbors (k-NN) algorithm is a widely used machine learning algorithm, applied to supervised learning tasks such as classification. Several parallel implementations of k-NN have been proposed in the literature and practice. However, on high-performance computing systems with high-speed interconnects, it is important to further accelerate existing designs of the k-NN algorithm by taking advantage of scalable programming models. To improve the performance of k-NN on large-scale environments with InfiniBand networks, this paper proposes several alternative hybrid MPI+OpenSHMEM designs and performs a systematic evaluation and analysis on typical workloads. The hybrid designs leverage one-sided memory access to better overlap communication with computation than the existing pure MPI design, and propose better schemes for efficient buffer management. The implementation based on the k-NN program from MaTEx with MVAPICH2-X (Unified MPI+PGAS Communication Runtime over InfiniBand) shows up to 9.0% time reduction for training the KDD Cup 2010 workload over 512 cores, and 27.6% time reduction for a small workload with balanced communication and computation. Experiments running with varied numbers of cores show that our design can maintain good scalability.
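As a serial stand-in for the distributed designs discussed above (not the MPI+OpenSHMEM code), the sketch below partitions the training set, has each partition return its k best candidates, and merges them; this mirrors the local-compute-then-merge structure that the one-sided communication schemes overlap with computation. The data and partition count are invented for the example.

```python
# Sketch of k-NN classification over a partitioned training set (illustrative only).
import numpy as np
from collections import Counter

def local_candidates(train_x, train_y, query, k):
    d = np.linalg.norm(train_x - query, axis=1)    # distances within one partition
    idx = np.argsort(d)[:k]
    return list(zip(d[idx], train_y[idx]))         # k best (distance, label) pairs

def knn_predict(partitions, query, k=3):
    # Each simulated "rank" returns its k local candidates; the merge keeps the global k.
    merged = sorted(c for px, py in partitions for c in local_candidates(px, py, query, k))[:k]
    return Counter(label for _, label in merged).most_common(1)[0][0]

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 4))
y = (x[:, 0] > 0).astype(int)
partitions = [(x[i::4], y[i::4]) for i in range(4)]  # 4 simulated ranks
print(knn_predict(partitions, rng.normal(size=4)))
```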
Engaging Older Adult Volunteers in National Service
ERIC Educational Resources Information Center
McBride, Amanda Moore; Greenfield, Jennifer C.; Morrow-Howell, Nancy; Lee, Yung Soo; McCrary, Stacey
2012-01-01
Volunteer-based programs are increasingly designed as interventions to affect the volunteers and the beneficiaries of the volunteers' activities. To achieve the intended impacts for both, programs need to leverage the volunteers' engagement by meeting their expectations, retaining them, and maximizing their perceptions of benefits. Programmatic…
A simple analytical model for dynamics of time-varying target leverage ratios
NASA Astrophysics Data System (ADS)
Lo, C. F.; Hui, C. H.
2012-03-01
In this paper we have formulated a simple theoretical model for the dynamics of the time-varying target leverage ratio of a firm under some assumptions based upon empirical observations. In our theoretical model the time evolution of the target leverage ratio of a firm can be derived self-consistently from a set of coupled Ito's stochastic differential equations governing the leverage ratios of an ensemble of firms by the nonlinear Fokker-Planck equation approach. The theoretically derived time paths of the target leverage ratio bear great resemblance to those used in the time-dependent stationary-leverage (TDSL) model [Hui et al., Int. Rev. Financ. Analy. 15, 220 (2006)]. Thus, our simple model is able to provide a theoretical foundation for the selected time paths of the target leverage ratio in the TDSL model. We also examine how the pace of the adjustment of a firm's target ratio, the volatility of the leverage ratio and the current leverage ratio affect the dynamics of the time-varying target leverage ratio. Hence, with the proposed dynamics of the time-dependent target leverage ratio, the TDSL model can be readily applied to generate the default probabilities of individual firms and to assess the default risk of the firms.
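A schematic way to write the kind of dynamics described above (the paper's exact specification may differ) is a mean-reverting diffusion for each firm's leverage ratio with a common time-varying target:

```latex
dL_i(t) = \kappa\,\bigl[\theta(t) - L_i(t)\bigr]\,dt + \sigma\,L_i(t)\,dW_i(t),
```

where κ sets the pace of adjustment toward the target θ(t) and σ is the leverage-ratio volatility. Requiring the ensemble of coupled equations to be self-consistent, via the associated nonlinear Fokker-Planck equation, is what pins down the time path of θ(t) and hence the inputs to a TDSL-style default-risk calculation.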
Neutron Characterization for Additive Manufacturing
NASA Technical Reports Server (NTRS)
Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.
2013-01-01
Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to significant thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer generated design. This effect also limits geometries that can be fabricated using AM, such as thin-walled, high-aspect-ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL Manufacturing Demonstration Facility (MDF) sponsored by the DOE's Advanced Manufacturing Office. The MDF is focusing on R&D of both metal and polymer AM pertaining to in-situ process monitoring and closed-loop controls; implementation of advanced materials in AM technologies; and demonstration, characterization, and optimization of next-generation technologies. ORNL is working directly with industry partners to leverage world-leading facilities in fields such as high performance computing, advanced materials characterization, and neutron sciences to solve fundamental challenges in advanced manufacturing. Specifically, MDF is leveraging two of the world's most advanced neutron facilities, the HFIR and SNS, to characterize additive manufactured components.
Design-for-Six-Sigma To Develop a Bioprocess Knowledge Management Framework.
Junker, Beth; Maheshwari, Gargi; Ranheim, Todd; Altaras, Nedim; Stankevicz, Michael; Harmon, Lori; Rios, Sandra; D'anjou, Marc
2011-01-01
Owing to the high costs associated with biopharmaceutical development, considerable pressure has developed for the biopharmaceutical industry to increase productivity by becoming more lean and flexible. The ability to reuse knowledge was identified as one key advantage to streamline productivity, efficiently use resources, and ultimately perform better than the competition. A knowledge management (KM) strategy was assembled for bioprocess-related information using the technique of Design-for-Six-Sigma (DFSS). This strategy supported quality-by-design and process validation efforts for pipeline as well as licensed products. The DFSS technique was selected because it was both streamlined and efficient. These characteristics permitted development of a KM strategy with minimized team leader and team member resources. DFSS also placed a high emphasis on the voice of the customer, information considered crucial to the selection of solutions most appropriate for the current knowledge-based challenges of the organization. The KM strategy developed was comprised of nine workstreams, constructed from related solution buckets which in turn were assembled from the individual solution tasks that were identified. Each workstream's detailed design was evaluated against published and established best practices, as well as the KM strategy project charter and design inputs. Gaps and risks were identified and mitigated as necessary to improve the robustness of the proposed strategy. Aggregated resources (specifically expense/capital funds and staff) and timing were estimated to obtain vital management sponsorship for implementation. Where possible, existing governance and divisional/corporate information technology efforts were leveraged to minimize the additional bioprocess resources required for implementation. Finally, leading and lagging indicator metrics were selected to track the success of pilots and eventual implementation. A knowledge management framework was assembled for bioprocess-related information using a streamlined and efficient technique that minimized team leader and member resources. The technique also highly emphasized input from the staff, who generated and used the knowledge, information considered crucial to selection of solutions most appropriate for the current knowledge-based challenges in the organization. The framework developed was comprised of nine workstreams, constructed from related solution buckets which were assembled from individual solution tasks that were identified. Each workstream's detailed design was evaluated against published and established best practices, as well as the project charter and design inputs. Gaps and risks were identified and mitigated to improve robustness of the proposed framework. Aggregated resources (specifically expense/capital funds and staff) and timing were estimated to obtain vital management sponsorship for implementation. Where possible, existing governance and information technology efforts were leveraged to minimize additional bioprocess resources required for implementation. Finally, metrics were selected to track the success of pilots and eventual implementation.
Design and implementation of population-based specialty care programs.
Botts, Sheila R; Gee, Michael T; Chang, Christopher C; Young, Iris; Saito, Logan; Lyman, Alfred E
2017-09-15
The development, implementation, and scaling of 3 population-based specialty care programs in a large integrated healthcare system are reviewed, and the role of clinical pharmacy services in ensuring safe, effective, and affordable care is highlighted. The Kaiser Permanente (KP) integrated healthcare delivery model allows for rapid development and expansion of innovative population management programs involving pharmacy services. Clinical pharmacists have assumed integral roles in improving the safety and effectiveness of high-complexity, high-cost care for specialty populations. These roles require an appropriate practice scope and are supported by an advanced electronic health record with disease registries and electronic surveillance tools for care-gap identification. The 3 specialty population programs described were implemented to address variation or unrecognized gaps in care for at-risk specialty populations. The Home Phototherapy Program has leveraged internal partnerships with clinical pharmacists to improve access to cost-effective nonpharmacologic interventions for psoriasis and other skin disorders. The Multiple Sclerosis Care Program has incorporated clinical pharmacists into neurology care in order to apply clinical guidelines in a systematic manner. The KP SureNet program has used clinical pharmacists and data analytics to identify opportunities to prevent drug-related adverse outcomes and ensure timely follow-up. Specialty care programs improve quality, cost outcomes, and the patient experience by appropriating resources to provide systematic and targeted care to high-risk patients. KP leverages an integration of people, processes, and technology to develop and scale population-based specialty care. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Haar, Sébastien; Ciesielski, Artur; Clough, Joseph; Yang, Huafeng; Mazzaro, Raffaello; Richard, Fanny; Conti, Simone; Merstorf, Nicolas; Cecchini, Marco; Morandi, Vittorio; Casiraghi, Cinzia; Samorì, Paolo
2015-04-08
Achieving full control over the production as well as processability of high-quality graphene represents a major challenge with potential interest in the field of fabrication of multifunctional devices. The outstanding effort dedicated to tackling this challenge in the last decade revealed that certain organic molecules are capable of leveraging the exfoliation of graphite with different efficiencies. Here, a fundamental understanding of a straightforward supramolecular approach for producing homogeneous dispersions of unfunctionalized and non-oxidized graphene nanosheets in four different solvents is attained, namely N-methyl-2-pyrrolidinone, N,N-dimethylformamide, ortho-dichlorobenzene, and 1,2,4-trichlorobenzene. In particular, a comparative study on the liquid-phase exfoliation of graphene in the presence of linear alkanes of different lengths terminated by a carboxylic-acid head group is performed. These molecules act as graphene dispersion-stabilizing agents during the exfoliation process. The efficiency of the exfoliation in terms of concentration of exfoliated graphene is found to be proportional to the length of the employed fatty acid. Importantly, a high percentage of single-layer graphene flakes is revealed by high-resolution transmission electron microscopy and Raman spectroscopy analyses. A simple yet effective thermodynamic model is developed to interpret the chain-length dependence of the exfoliation yield. This approach, relying on the synergistic effect of an ad hoc solvent and molecules to promote the exfoliation of graphene in liquid media, represents a promising and modular strategy towards the rational design of improved dispersion-stabilizing agents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2017-01-01
aircraft types. Like the project scorecard, well-designed mission outcome metrics are concrete and relatable: sorties generated, pilots graduated...systems are designed to serve functions like this. To the degree that these data can be exported and synthesized, they can be leveraged to tie to a...expectations of engineers who designed a facility, as well as end users who utilize it. A notional depiction of how asset condition changes over
NASA Astrophysics Data System (ADS)
Biles, Melissa
2012-12-01
This response to Leah A. Bricker and Phillip Bell's paper, GodMode is his video game name, examines their assertion that the social nexus of gaming practices is an important factor to consider for those looking to design STEM video games. I propose that we need to go beyond the investigation into which aspects of games play a role in learning, and move on to thinking about how these insights can actually inform game design practice.
ERIC Educational Resources Information Center
Biles, Melissa
2012-01-01
This response to Leah A. Bricker and Phillip Bell's paper, "GodMode is his video game name", examines their assertion that the social nexus of gaming practices is an important factor to consider for those looking to design STEM video games. I propose that we need to go beyond the investigation into which aspects of games play a role in learning,…
Development and Testing of Mechanism Technology for Space Exploration in Extreme Environments
NASA Technical Reports Server (NTRS)
Tyler, Tony R.; Levanas, Greg; Mojarradi, Mohammad M.; Abel, Phillip B.
2011-01-01
The NASA Jet Propulsion Lab (JPL), Glenn Research Center (GRC), Langley Research Center (LaRC), and Aeroflex, Inc. have partnered to develop and test actuator hardware that will survive the stringent environment of the moon, and which can also be leveraged for other challenging space exploration missions. Prototype actuators have been built and tested in a unique low temperature test bed with motor interface temperatures as low as 14 degrees Kelvin. Several years of work have resulted in specialized electro-mechanical hardware to survive extreme space exploration environments, a test program that verifies and finds limitations of the designs at extreme temperatures, and a growing knowledge base that can be leveraged by future space exploration missions.
13 CFR 108.1150 - Maximum amount of Leverage for a NMVC Company.
Code of Federal Regulations, 2010 CFR
2010-01-01
... NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM SBA Financial Assistance for NMVC Companies (Leverage... percent of its Leverageable Capital. Conditional Commitments by SBA To Reserve Leverage for a NMVC Company ...
Leveraging Higher Education Consortia for Institutional Advancement
ERIC Educational Resources Information Center
Burley, Diana; Gnam, Cathy; Newman, Robin; Straker, Howard; Babies, Tanika
2012-01-01
Purpose: The purpose of this paper is to explore conceptually the role of higher education consortia in facilitating the operational advancement of member institutions, and in enabling their development as learning organizations in a changing and competitive higher education environment. Design/methodology/approach: This article synthesizes the…
Leveraging Collaborative, Thematic Problem-Based Learning to Integrate Curricula
ERIC Educational Resources Information Center
Sroufe, Robert; Ramos, Diane P.
2015-01-01
This study chronicles learning from faculty who designed and delivered collaborative, problem-based learning courses that anchor a one-year MBA emphasizing sustainability. While cultivating the application of learning across the curriculum, the authors engaged MBA students in solving complex, real-world sustainability challenges using a…
Learnable Interfaces--Leveraging Navigation by Design
ERIC Educational Resources Information Center
Swanson, Kari Gunvaldson
2012-01-01
Complex productivity applications that integrate tasks in the workplace are becoming more common. Usability typically focuses on short-term, immediate measures of task performance. This study incorporates a long-term goal of more durable learning, focusing on implicit learning (spontaneous, unplanned, usually unconscious learning as a result of…
Low-cost space-varying FIR filter architecture for computational imaging systems
NASA Astrophysics Data System (ADS)
Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.
2010-01-01
Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores the contrast reduced by these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.
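A minimal sketch of the radial filter-bank idea is shown below, assuming a nearest-zone kernel lookup and SciPy for the 2-D convolution; the kernel coefficients, the number of radial zones, and the per-pixel hardware selection used in the actual architecture are assumptions and are not reproduced here.

import numpy as np
from scipy.ndimage import convolve

def radial_filter_bank_sharpen(image, kernels, center=None):
    """Sharpen `image` with a bank of FIR kernels selected by radius.

    kernels: list of 2-D arrays ordered from on-axis to edge-of-field;
    each pixel uses the kernel assigned to its radial zone.
    """
    h, w = image.shape
    cy, cx = center if center is not None else (h / 2.0, w / 2.0)
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - cy, xx - cx)
    # Map each pixel's radius onto one of len(kernels) concentric zones.
    zone = np.minimum(
        (radius / radius.max() * len(kernels)).astype(int), len(kernels) - 1
    )
    # Filter the frame once per kernel, then blend the results by zone membership.
    # (A hardware implementation would instead select coefficients per pixel.)
    out = np.zeros_like(image, dtype=float)
    for k, kernel in enumerate(kernels):
        filtered = convolve(image.astype(float), kernel)
        out[zone == k] = filtered[zone == k]
    return out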
NASA Astrophysics Data System (ADS)
Javahery, Homa; Deichman, Alexander; Seffah, Ahmed; Taleb, Mohamed
Patterns are a design tool to capture best practices, tackling problems that occur in different contexts. A user interface (UI) design pattern spans several levels of design abstraction ranging from high-level navigation to low-level idioms detailing a screen layout. One challenge is to combine a set of patterns to create a conceptual design that reflects user experiences. In this chapter, we detail a user-centered design (UCD) framework that exploits the novel idea of using personas and patterns together. Personas are used initially to collect and model user experiences. UI patterns are selected based on persona specifications; these patterns are then used as building blocks for constructing conceptual designs. Through the use of a case study, we illustrate how personas and patterns can act as complementary techniques in narrowing the gap between two major steps in UCD: capturing users and their experiences, and building an early design based on that information. As a result of lessons learned from the study and by refining our framework, we define a more systematic process called UX-P (User Experiences to Pattern), with a supporting tool. The process introduces intermediate analytical steps and supports designers in creating usable designs.
Anti-Cocaine Vaccine Based on Coupling a Cocaine Analog to a Disrupted Adenovirus
Koob, George; Hicks, Martin J.; Wee, Sunmee; Rosenberg, Jonathan B.; De, Bishnu P.; Kaminksy, Stephen M.; Moreno, Amira; Janda, Kim D.; Crystal, Ronald G.
2012-01-01
The challenge in developing an anti-cocaine vaccine is that cocaine is a small molecule, invisible to the immune system. Leveraging the knowledge that adenovirus (Ad) capsid proteins are highly immunogenic in humans, we hypothesized that linking a cocaine hapten to Ad capsid proteins would elicit high-affinity, high-titer antibodies against cocaine, sufficient to sequester systemically administered cocaine and prevent access to the brain, thus suppressing cocaine-induced behaviors. Based on these concepts, we developed dAd5GNE, a disrupted E1−E3− serotype 5 Ad with GNE, a stable cocaine analog, covalently linked to the Ad capsid proteins. In pre-clinical studies, dAd5GNE evoked persistent, high titer, high affinity IgG anti-cocaine antibodies, and was highly effective in blocking cocaine-induced hyperactivity and cocaine self-administration behavior in rats. Future studies will be designed to expand the efficacy studies, carry out relevant toxicology studies, and test dAd5GNE in human cocaine addicts. PMID:22229312
Transformational electronics are now reconfiguring
NASA Astrophysics Data System (ADS)
Rojas, Jhonathan P.; Hussain, Aftab M.; Arevalo, A.; Foulds, I. G.; Torres Sevilla, Galo A.; Nassar, Joanna M.; Hussain, Muhammad M.
2015-05-01
Current developments on enhancing our smart living experience are leveraging the increased interest for novel systems that can be compatible with foldable, wrinkled, wavy and complex geometries and surfaces, and thus become truly ubiquitous and easy to deploy. Therefore, relying on innovative structural designs we have been able to reconfigure the physical form of various materials, to achieve remarkable mechanical flexibility and stretchability, which provides us with the perfect platform to develop enhanced electronic systems for application in entertainment, healthcare, fitness and wellness, military and manufacturing industry. Based on these novel structural designs we have developed a silicon-based network of hexagonal islands connected through double-spiral springs, forming an ultra-stretchable (~1000%) array for full compliance to highly asymmetric shapes and surfaces, as well as a serpentine design used to show an ultra-stretchable (~800%) and flexible, spatially reconfigurable, mobile, metallic thin film copper (Cu)-based, body-integrated and non-invasive thermal heater with wireless controlling capability, reusability, heating-adaptability and affordability due to low-cost complementary metal oxide semiconductor (CMOS)-compatible integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Karla
Although the high-performance computing (HPC) community increasingly embraces object-oriented programming (OOP), most HPC OOP projects employ the C++ programming language. Until recently, Fortran programmers interested in mining the benefits of OOP had to emulate OOP in Fortran 90/95. The advent of widespread compiler support for Fortran 2003 now facilitates explicitly constructing object-oriented class hierarchies via inheritance and leveraging related class behaviors such as dynamic polymorphism. Although C++ allows a class to inherit from multiple parent classes, Fortran and several other OOP languages restrict or prohibit explicit multiple inheritance relationships in order to circumvent several pitfalls associated with them. Nonetheless, what appears as an intrinsic feature in one language can be modeled as a user-constructed design pattern in another language. The present paper demonstrates how to apply the facade structural design pattern to support a multiple inheritance class relationship in Fortran 2003. As a result, the design unleashes the power of the associated class relationships for modeling complicated data structures yet avoids the ambiguities that plague some multiple inheritance scenarios.
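The same facade idea can be sketched outside Fortran. The fragment below is an assumption-laden Python illustration (the class names and methods are invented for exposition, and Python is used only because it reads compactly; the paper's implementation is in Fortran 2003): the facade inherits one parent's behavior and holds the second "parent" as a component whose interface it forwards, so client code sees a single object with both behaviors and no explicit multiple-inheritance relationship.

class Integrator:
    """First 'parent': generic time-stepping behavior."""
    def step(self, state, dt):
        return state + dt * self.rhs(state)

class Writer:
    """Second 'parent': output behavior, held as a component by the facade."""
    def write(self, state, path):
        with open(path, "a") as fh:
            fh.write(f"{state}\n")

class Model(Integrator):
    """Facade: inherits one behavior, delegates the other via composition."""
    def __init__(self):
        self._writer = Writer()        # composition instead of a second parent
    def rhs(self, state):              # hypothetical right-hand side
        return -0.5 * state
    def write(self, state, path):      # forwarded interface of the component
        self._writer.write(state, path)

# Usage: the client sees one object offering both step() and write().
m = Model()
s = m.step(1.0, 0.1)
m.write(s, "out.txt")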
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaing, C; Gardner, S
The goal of this project is to develop forensic genotyping assays for select agent viruses, enhancing the current capabilities for the viral bioforensics and law enforcement community. We used a multipronged approach combining bioinformatics analysis, PCR-enriched samples, microarrays and TaqMan assays to develop high resolution and cost effective genotyping methods for strain level forensic discrimination of viruses. We have leveraged substantial experience and efficiency gained through year 1 on software development, SNP discovery, TaqMan signature design and phylogenetic signature mapping to scale up the development of forensics signatures in year 2. In this report, we have summarized the whole genome-wide SNP analysis and microarray probe design for forensics characterization of South American hemorrhagic fever viruses, tick-borne encephalitis viruses and henipaviruses, Old World Arenaviruses, filoviruses, Crimean-Congo hemorrhagic fever virus, Rift Valley fever virus and Japanese encephalitis virus.
NASA Technical Reports Server (NTRS)
Rector, Tony; Peyton, Barbara M.; Steele, John W.; Makinen, Janice; Bue, Grant C.; Campbell, Colin
2014-01-01
Water loop maintenance components to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop have undergone a comparative performance evaluation with a second SWME water recirculation loop with no water quality maintenance. Results show the benefits of periodic water maintenance. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing Sublimator technology. The driver for the evaluation of water recirculation maintenance components was to further enhance this advantage through the leveraging of fluid loop management lessons learned from the International Space Station (ISS). A bed design that was developed for a UTAS military application, and considered for a potential ISS application with the Urine Processor Assembly, provided a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The maintenance cycle included the use of a biocide delivery component developed for ISS to introduce a biocide in a microgravity compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware.
NASA Technical Reports Server (NTRS)
Rector, Tony; Peyton, Barbara M.; Steele, John W.; Makinen, Janice; Bue, Grant C.; Campbell, Colin
2014-01-01
Water loop maintenance components to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop have undergone a comparative performance evaluation with a recirculating control loop which had no water quality maintenance. Results show that periodic water maintenance can improve performance of the SWME. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage of this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing sublimator technology. The driver for the evaluation of water recirculation maintenance components was to enhance the robustness of the SWME through the leveraging of fluid loop management lessons learned from the International Space Station (ISS). A patented bed design that was developed for a United Technologies Aerospace System military application provided a low pressure drop means for water maintenance in the SWME recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The maintenance cycle included the use of a biocide delivery component developed for the ISS to introduce a biocide in a microgravity compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware.
NASA Technical Reports Server (NTRS)
Rector, Tony; Peyton, Barbara; Steele, John W.; Bue, Grant C.; Campbell, Colin; Makinen, Janice
2014-01-01
Water loop maintenance components to maintain the water quality of the Advanced Spacesuit Water Membrane Evaporation (SWME) water recirculation loop have undergone a comparative performance evaluation with a second SWME water recirculation loop with no water quality maintenance. Results show the benefits of periodic water maintenance. The SWME is a heat rejection device under development at the NASA Johnson Space Center to perform thermal control for advanced spacesuits. One advantage to this technology is the potential for a significantly greater degree of tolerance to contamination when compared to the existing Sublimator technology. The driver for the evaluation of water recirculation maintenance components was to further enhance this advantage through the leveraging of fluid loop management lessons learned from the International Space Station (ISS). A bed design that was developed for a UTAS military application, and considered for a potential ISS application with the Urine Processor Assembly, provided a low pressure drop means for water maintenance in a recirculation loop. The bed design is coupled with high capacity ion exchange resins, organic adsorbents, and a cyclic methodology developed for the Extravehicular Mobility Unit (EMU) Transport Water loop. The maintenance cycle included the use of a biocide delivery component developed for ISS to introduce a biocide in a microgravity-compatible manner for the Internal Active Thermal Control System (IATCS). The leveraging of these water maintenance technologies to the SWME recirculation loop is a unique demonstration of applying the valuable lessons learned on the ISS to the next generation of manned spaceflight Environmental Control and Life Support System (ECLSS) hardware.
NASA Technical Reports Server (NTRS)
Schneider, Steven J.
1997-01-01
NASA Lewis Research Center's On-Board Propulsion program (OBP) is developing low-thrust chemical propulsion technologies for both satellite and vehicle reaction control applications. There is a vigorous international competition to develop new, high-performance bipropellant engines. High-leverage bipropellant systems are critical to both commercial competitiveness in the international communications market and to cost-effective mission design in government sectors. To significantly improve bipropellant engine performance, we must increase the thermal margin of the chamber materials. Iridium-coated rhenium (Ir/Re) engines, developed and demonstrated under OBP programs, can operate at temperatures well above the constraints of state-of-practice systems, providing a sufficient margin to maximize performance with the hypergolic propellants used in most satellite propulsion systems.
Combinatorial Nano-Bio Interfaces.
Cai, Pingqiang; Zhang, Xiaoqian; Wang, Ming; Wu, Yun-Long; Chen, Xiaodong
2018-06-08
Nano-bio interfaces are emerging from the convergence of engineered nanomaterials and biological entities. Despite rapid growth, clinical translation of biomedical nanomaterials is heavily compromised by the lack of comprehensive understanding of biophysicochemical interactions at nano-bio interfaces. In the past decade, a few investigations have adopted a combinatorial approach toward decoding nano-bio interfaces. Combinatorial nano-bio interfaces comprise the design of nanocombinatorial libraries and high-throughput bioevaluation. In this Perspective, we address challenges in combinatorial nano-bio interfaces and call for multiparametric nanocombinatorics (composition, morphology, mechanics, surface chemistry), multiscale bioevaluation (biomolecules, organelles, cells, tissues/organs), and the recruitment of computational modeling and artificial intelligence. Leveraging combinatorial nano-bio interfaces will shed light on precision nanomedicine and its potential applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.
2014-12-17
GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.
Pneumatic Variable Series Elastic Actuator.
Zheng, Hao; Wu, Molei; Shen, Xiangrong
2016-08-01
Inspired by human motor control theory, stiffness control is highly effective in manipulation and human-interactive tasks. The implementation of stiffness control in robotic systems, however, has largely been limited to closed-loop control, and suffers from multiple issues such as limited frequency range, potential instability, and lack of contribution to energy efficiency. Variable-stiffness actuator represents a better solution, but the current designs are complex, heavy, and bulky. The approach in this paper seeks to address these issues by using pneumatic actuator as a variable series elastic actuator (VSEA), leveraging the compressibility of the working fluid. In this work, a pneumatic actuator is modeled as an elastic element with controllable stiffness and equilibrium point, both of which are functions of air masses in the two chambers. As such, for the implementation of stiffness control in a robotic system, the desired stiffness/equilibrium point can be converted to the desired chamber air masses, and a predictive pressure control approach is developed to control the timing of valve switching to obtain the desired air mass while minimizing control action. Experimental results showed that the new approach in this paper requires less expensive hardware (on-off valve instead of proportional valve), causes less control action in implementation, and provides good control performance by leveraging the inherent dynamics of the actuator.
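To illustrate the stiffness-to-air-mass conversion in the simplest terms, the sketch below assumes an isothermal ideal-gas model of a double-acting cylinder with effective chamber lengths l_a and l_b at the mid-stroke position; the gas constant, temperature, and geometry values are placeholders, and the paper's predictive valve-timing controller is not reproduced.

R_AIR = 287.0   # J/(kg*K), specific gas constant for air (assumed working fluid)
T = 293.0       # K, isothermal simplification

def chamber_masses(k_des, x_eq, l_a=0.05, l_b=0.05):
    """Return (m_a, m_b) in kg giving stiffness k_des [N/m] and equilibrium
    piston position x_eq [m] for an isothermal double-acting cylinder.

    Assumed force model (piston area cancels):
        F(x) = m_a*R*T/(l_a + x) - m_b*R*T/(l_b - x)
    Zero net force at x_eq and stiffness k_des = -dF/dx at x_eq fix m_a, m_b.
    """
    la, lb = l_a + x_eq, l_b - x_eq
    if la <= 0 or lb <= 0:
        raise ValueError("equilibrium point lies outside the stroke")
    # At equilibrium both chamber force terms equal u; the stiffness then fixes u.
    u = k_des / (1.0 / la + 1.0 / lb)
    m_a = u * la / (R_AIR * T)
    m_b = u * lb / (R_AIR * T)
    return m_a, m_b

# Example: 2 kN/m stiffness with the equilibrium 10 mm off mid-stroke.
m_a, m_b = chamber_masses(k_des=2000.0, x_eq=0.01)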
Pneumatic Variable Series Elastic Actuator
Zheng, Hao; Wu, Molei; Shen, Xiangrong
2016-01-01
Inspired by human motor control theory, stiffness control is highly effective in manipulation and human-interactive tasks. The implementation of stiffness control in robotic systems, however, has largely been limited to closed-loop control, and suffers from multiple issues such as limited frequency range, potential instability, and lack of contribution to energy efficiency. Variable-stiffness actuator represents a better solution, but the current designs are complex, heavy, and bulky. The approach in this paper seeks to address these issues by using pneumatic actuator as a variable series elastic actuator (VSEA), leveraging the compressibility of the working fluid. In this work, a pneumatic actuator is modeled as an elastic element with controllable stiffness and equilibrium point, both of which are functions of air masses in the two chambers. As such, for the implementation of stiffness control in a robotic system, the desired stiffness/equilibrium point can be converted to the desired chamber air masses, and a predictive pressure control approach is developed to control the timing of valve switching to obtain the desired air mass while minimizing control action. Experimental results showed that the new approach in this paper requires less expensive hardware (on–off valve instead of proportional valve), causes less control action in implementation, and provides good control performance by leveraging the inherent dynamics of the actuator. PMID:27354755
High Throughput Transcriptomics: From screening to pathways
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Learning and Design with Online Real-Time Collaboration
ERIC Educational Resources Information Center
Stevenson, Michael; Hedberg, John G.
2013-01-01
This paper explores the use of emerging Cloud technologies that support real-time online collaboration. It considers the extent to which these technologies can be leveraged to develop complex skillsets supporting interaction between multiple learners in online spaces. In a pilot study that closely examines how groups of learners translate two…
Embodied Perspective Taking in Learning about Complex Systems
ERIC Educational Resources Information Center
Soylu, Firat; Holbert, Nathan; Brady, Corey; Wilensky, Uri
2017-01-01
In this paper we present a learning design approach that leverages perspective-taking to help students learn about complex systems. We define perspective-taking as projecting one's identity onto external entities (both animate and inanimate) in an effort to predict and anticipate events based on ecological cues, to automatically sense the…
Universities and Libraries Move to the Mobile Web
ERIC Educational Resources Information Center
Aldrich, Alan W.
2010-01-01
The convergence of web-enabled smartphones, the applications designed for smartphone interfaces, and cloud computing is rapidly changing how people interact with each other and with their environments. The commercial sector has taken the lead in creating mobile websites that leverage the capacities of smartphones, and the academic community has…
Leveraging Educational Technology to Overcome Social Obstacles to Help Seeking
ERIC Educational Resources Information Center
Howley, Iris
2015-01-01
This dissertation provides initial empirical evidence for Expectancy Value Theory for Help Sources and generates design recommendations for online courses based on the newfound understanding between theory and student behavior. (Abstract shortened by UMI.). [The dissertation citations contained here are published with the permission of ProQuest…
Research Cluster Development at a Predominantly Undergraduate Institution
ERIC Educational Resources Information Center
Langley-Turnbaugh, S. J.; Shehata, T.
2015-01-01
The University of Southern Maine (USM) designed and implemented an internal Research Cluster Seed Fund competition with the goals of building USM faculty expertise to address industry and community needs, deepening the impact of research through an interdisciplinary approach to solving problems, and leveraging external funding to sustain…
OpenICE medical device interoperability platform overview and requirement analysis.
Arney, David; Plourde, Jeffrey; Goldman, Julian M
2018-02-23
We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.
Tacit Knowledge Barriers in Franchising: Practical Solutions
ERIC Educational Resources Information Center
Cumberland, Denise; Githens, Rod
2012-01-01
Purpose: The purpose of this paper is to identify barriers that hinder tacit knowledge transfer in a franchise environment and offer a compendium of solutions that encourage franchisees and franchisors to leverage tacit knowledge as a resource for competitive advantage. Design/methodology/approach: Drawing from the research on franchise…
Open Crowdsourcing: Leveraging Community Software Developers for IT Projects
ERIC Educational Resources Information Center
Phair, Derek
2012-01-01
This qualitative exploratory single-case study was designed to examine and understand the use of volunteer community participants as software developers and other project related roles, such as testers, in completing a web-based application project by a non-profit organization. This study analyzed the strategic decision to engage crowd…
Talking about Texts: Middle School Students' Engagement in Metalinguistic Talk
ERIC Educational Resources Information Center
D'warte, Jacqueline
2012-01-01
In this paper, discourse analytical methods are applied to data from two middle school classrooms, as a teacher, researcher, and students engage in research-based curricula (Martinez, Orellana, Pacheco, & Carbone, 2008; Orellana & Reynolds, 2008) designed to leverage students' language brokering skills and facilitate discussion about languages.…
Leveraging Knowledge: Impact on Low Cost Planetary Mission Design.
ERIC Educational Resources Information Center
Momjian, Jennifer
This paper discusses innovations developed by the Jet Propulsion Laboratory (JPL) librarians to reduce the information query cycle time for teams planning low-cost, planetary missions. The first section provides background on JPL and its library. The second section addresses the virtual information environment, including issues of access, content,…
A Conceptual Framework for Examining Knowledge Management in Higher Education Contexts
ERIC Educational Resources Information Center
Lee, Hae-Young; Roth, Gene L.
2009-01-01
Knowledge management is an on-going process that involves varied activities: diagnosis, design, and implementation of knowledge creation, knowledge transfer, and knowledge sharing. The primary goal of knowledge management, like other management theories or models, is to identify and leverage organizational and individual knowledge for the…
Fiscal Challenge: An Experiential Exercise in Policy Making
ERIC Educational Resources Information Center
Aguilar, Mike; Soques, Daniel
2015-01-01
In this article, the authors introduce a pedagogical innovation that is designed to enhance students' understanding of fiscal policy in general, and the national debt and deficit in particular. The innovation leverages the educational advantages offered through a competitive environment by pitting teams of students against one another with the…
Does Student Attrition Explain KIPP's Success?
ERIC Educational Resources Information Center
Nichols-Barrer, Ira; Gill, Brian P.; Gleason, Philip; Tuttle, Christina Clark
2014-01-01
The Knowledge Is Power Program (KIPP) is a network of charter schools designed to improve the educational opportunities available to low-income families. KIPP schools seek to boost their students' academic achievement and ultimately prepare them to enroll and succeed in college. To achieve these objectives, KIPP schools leverage strong…
Leveraging ARRA Funding for Developing Comprehensive State Longitudinal Data Systems
ERIC Educational Resources Information Center
Pfeiffer, Jay; Klein, Steven; Levesque, Karen
2009-01-01
The American Recovery and Reinvestment Act (ARRA) provides several funding opportunities that can assist states in designing, developing, and implementing statewide education longitudinal data systems. These new and enhanced information systems will enable states to track student progress within and across the secondary and postsecondary education…
Network-Cognizant Voltage Droop Control for Distribution Grids
Baker, Kyri; Bernstein, Andrey; Dall'Anese, Emiliano; ...
2017-08-07
Our paper examines distribution systems with a high integration of distributed energy resources (DERs) and addresses the design of local control methods for real-time voltage regulation. Particularly, the paper focuses on proportional control strategies where the active and reactive output-powers of DERs are adjusted in response to (and proportionally to) local changes in voltage levels. The design of the voltage-active power and voltage-reactive power characteristics leverages suitable linear approximation of the AC power-flow equations and is network-cognizant; that is, the coefficients of the controllers embed information on the location of the DERs and forecasted non-controllable loads/injections and, consequently, on the effect of DER power adjustments on the overall voltage profile. We pursued a robust approach to cope with uncertainty in the forecasted non-controllable loads/power injections. Stability of the proposed local controllers is analytically assessed and numerically corroborated.
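A minimal sketch of the proportional (droop) rule for a single DER is given below; the gains, setpoints, and limits are placeholders, and the network-cognizant derivation of the coefficients from a linearized power-flow model and the robust design procedure are not reproduced here.

def droop_update(v_meas, v_ref, p_set, q_set, alpha_p, alpha_q,
                 p_min, p_max, q_min, q_max):
    """Local volt-watt / volt-var proportional control for one DER.

    Active and reactive output powers are adjusted proportionally to the
    local voltage deviation and then clipped to the inverter's limits.
    """
    dv = v_meas - v_ref
    p = min(max(p_set - alpha_p * dv, p_min), p_max)
    q = min(max(q_set - alpha_q * dv, q_min), q_max)
    return p, q

# Example: a 0.02 p.u. overvoltage curtails active power and absorbs reactive power.
p, q = droop_update(v_meas=1.02, v_ref=1.0, p_set=5.0, q_set=0.0,
                    alpha_p=50.0, alpha_q=100.0,
                    p_min=0.0, p_max=5.0, q_min=-3.0, q_max=3.0)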
Network Model-Assisted Inference from Respondent-Driven Sampling Data
Gile, Krista J.; Handcock, Mark S.
2015-01-01
Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
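For orientation, the baseline design-based RDS estimator that such model-assisted approaches generalize weights each respondent inversely to reported network size (a Volz-Heckathorn-style estimator). The sketch below shows only that baseline, not the authors' model-assisted estimator or its bootstrap standard errors; the variable names are assumptions.

def vh_mean(values, degrees):
    """Volz-Heckathorn-style estimate of a population mean from RDS data.

    values:  outcome for each sampled respondent (e.g., 1/0 HIV status)
    degrees: each respondent's reported personal network size (> 0)
    Respondents are weighted by 1/degree, reflecting that higher-degree
    nodes are more likely to be reached by link-tracing.
    """
    weights = [1.0 / d for d in degrees]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Example: estimated prevalence from five respondents.
prev = vh_mean([1, 0, 0, 1, 0], [10, 4, 6, 20, 5])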
Network Model-Assisted Inference from Respondent-Driven Sampling Data.
Gile, Krista J; Handcock, Mark S
2015-06-01
Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.
Network-Cognizant Voltage Droop Control for Distribution Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Bernstein, Andrey; Dall'Anese, Emiliano
Our paper examines distribution systems with a high integration of distributed energy resources (DERs) and addresses the design of local control methods for real-time voltage regulation. Particularly, the paper focuses on proportional control strategies where the active and reactive output-powers of DERs are adjusted in response to (and proportionally to) local changes in voltage levels. The design of the voltage-active power and voltage-reactive power characteristics leverages suitable linear approximation of the AC power-flow equations and is network-cognizant; that is, the coefficients of the controllers embed information on the location of the DERs and forecasted non-controllable loads/injections and, consequently, on the effect of DER power adjustments on the overall voltage profile. We pursued a robust approach to cope with uncertainty in the forecasted non-controllable loads/power injections. Stability of the proposed local controllers is analytically assessed and numerically corroborated.
Chery, Joyce G; Sass, Chodon; Specht, Chelsea D
2017-09-01
We developed a bioinformatic pipeline that leverages a publicly available genome and published transcriptomes to design primers in conserved coding sequences flanking targeted introns of single-copy nuclear loci. Paullinieae (Sapindaceae) is used to demonstrate the pipeline. Transcriptome reads phylogenetically closer to the lineage of interest are aligned to the closest genome. Single-nucleotide polymorphisms are called, generating a "pseudoreference" closer to the lineage of interest. Several filters are applied to meet the criteria of single-copy nuclear loci with introns of a desired size. Primers are designed in conserved coding sequences flanking introns. Using this pipeline, we developed nine single-copy nuclear intron markers for Paullinieae. This pipeline is highly flexible and can be used for any group with available genomic and transcriptomic resources. This pipeline led to the development of nine variable markers for phylogenetic study without generating sequence data de novo.
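As a sketch of the locus-filtering step described above (field names and thresholds are hypothetical; read alignment, SNP calling, pseudoreference construction, and the actual primer design are not shown), candidate loci can be screened for single-copy status, intron size, and adequate conserved flanks:

def select_candidate_loci(loci, min_intron=400, max_intron=1200, min_flank=100):
    """Keep loci that look usable for intron-spanning primer design.

    loci: iterable of dicts with hypothetical fields
      'copy_number'    - inferred gene copy number in the reference
      'intron_len'     - length of the targeted intron (bp)
      'left_exon_len' / 'right_exon_len' - conserved coding flanks (bp)
    """
    keep = []
    for locus in loci:
        if locus["copy_number"] != 1:
            continue                      # require single-copy nuclear loci
        if not (min_intron <= locus["intron_len"] <= max_intron):
            continue                      # intron within the desired size range
        if min(locus["left_exon_len"], locus["right_exon_len"]) < min_flank:
            continue                      # enough conserved exon to place primers
        keep.append(locus)
    return keep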
Understanding Overbidding: Using the Neural Circuitry of Reward to Design Economic Auctions
Delgado, Mauricio R.; Schotter, Andrew; Ozbay, Erkut Y.; Phelps, Elizabeth A.
2011-01-01
We take advantage of our knowledge of the neural circuitry of reward to investigate a puzzling economic phenomenon: Why do people overbid in auctions? Using functional magnetic resonance imaging (fMRI), we observed that the social competition inherent in an auction results in a more pronounced blood oxygen level–dependent (BOLD) response to loss in the striatum, with greater overbidding correlated with the magnitude of this response. Leveraging these neuroimaging results, we design a behavioral experiment that demonstrates that framing an experimental auction to emphasize loss increases overbidding. These results highlight a role for the contemplation of loss in understanding the tendency to bid “too high.” Current economic theories suggest overbidding may result from either “joy of winning” or risk aversion. By combining neuroeconomic and behavioral economic techniques, we find that another factor, namely loss contemplation in a social context, may mediate overbidding in auctions. PMID:18818362
Understanding overbidding: using the neural circuitry of reward to design economic auctions.
Delgado, Mauricio R; Schotter, Andrew; Ozbay, Erkut Y; Phelps, Elizabeth A
2008-09-26
We take advantage of our knowledge of the neural circuitry of reward to investigate a puzzling economic phenomenon: Why do people overbid in auctions? Using functional magnetic resonance imaging (fMRI), we observed that the social competition inherent in an auction results in a more pronounced blood oxygen level-dependent (BOLD) response to loss in the striatum, with greater overbidding correlated with the magnitude of this response. Leveraging these neuroimaging results, we design a behavioral experiment that demonstrates that framing an experimental auction to emphasize loss increases overbidding. These results highlight a role for the contemplation of loss in understanding the tendency to bid "too high." Current economic theories suggest overbidding may result from either "joy of winning" or risk aversion. By combining neuroeconomic and behavioral economic techniques, we find that another factor, namely loss contemplation in a social context, may mediate overbidding in auctions.
Army Net Zero Prove Out. Integrated Net Zero Best Practices
2014-11-18
leveraged to increase awareness. Public Service Announcements developed by the Army or other installations can be leveraged. Signage for...
NASA Astrophysics Data System (ADS)
Marzari, Nicola
The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the prediction from first principles of many or of complex materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials' informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.
Compact sources for eyesafe illumination
NASA Astrophysics Data System (ADS)
Baranova, Nadia; Pu, Rui; Stebbins, Kenneth; Bystryak, Ilya; Rayno, Michael; Ezzo, Kevin; DePriest, Christopher
2018-02-01
Q-Peak has demonstrated a compact, pulsed eyesafe laser architecture operating with >10 mJ pulse energies at repetition rates as high as 160 Hz. The design leverages an end-pumped solid-state laser geometry to produce adequate eyesafe beam quality (M2 ≈ 4), while also providing a path toward higher-density laser architectures for pulsed eyesafe applications. The baseline discussed in this paper has shown a unique capability for high-pulse repetition rates in a compact package, and offers additional potential for power scaling based on birefringence compensation. The laser consists of an actively Q-switched oscillator cavity producing pulse widths <30 ns, and utilizing an end-pumped Nd:YAG gain medium with a rubidium titanyl phosphate electro-optical crystal. The oscillator provides an effective front-end-seed for an optical parametric oscillator (OPO), which utilizes potassium titanyl arsenate in a linear OPO geometry. This laser efficiently operates in the eyesafe band, and has been designed to fit within a volume of 3760 cm3. We will discuss details of the optical system design, modeled thermal effects and stress-induced birefringence, as well as experimental advantages of the end-pumped laser geometry, along with proposed paths to higher eyesafe pulse energies.
Compact sources for eyesafe illumination
NASA Astrophysics Data System (ADS)
Baranova, N.; Pu, R.; Stebbins, K.; Bystryak, I.; Rayno, M.; Ezzo, K.; DePriest, C.
2017-02-01
Q-Peak has demonstrated a novel, compact, pulsed eyesafe laser architecture operating with <10 mJ pulse energies at repetition rates as high as 160 Hz. The design leverages an end-pumped solid-state laser geometry to produce adequate eyesafe beam quality (M2 ≈ 4), while also providing a path towards higher-density laser architectures for pulsed eyesafe applications. The baseline discussed in this paper has shown a unique capability for high pulse repetition rates in a compact package, and offers additional potential for power scaling based on birefringence compensation. The laser consists of an actively Q-switched oscillator cavity producing pulse-widths <30 ns, and utilizing an end-pumped Nd:YAG gain medium with a Rubidium Titanyl Phosphate (RTP) electro-optical crystal. The oscillator provides an effective front-end-seed for an optical parametric oscillator (OPO), which utilizes Potassium Titanyl Arsenate (KTA) in a linear OPO geometry. This laser efficiently operates in the eyesafe band, and has been designed to fit within a volume of 3760 cm3. We will discuss details of the optical system design, modeled thermal effects and stress-induced birefringence, as well as experimental advantages of the end-pumped laser geometry, along with proposed paths to higher eyesafe pulse energies.
Information Age Transformation: Getting to a 21st Century Military (revised)
2002-06-01
strategy for transformation is built around experimentation with network-centric concepts designed to leverage the power of Information Age technologies and...Edward A. Smith: From Network-Centric to Effects-Based Operations. CHAPTER 2 Background and Purpose: DoD is fully committed to taking advantage...Network Centric Warfare (NCW) translates these broad vision statements into a way ahead. NCW is a set of warfighting concepts designed to create and
Barnett, Miya L; Lau, Anna S; Miranda, Jeanne
2018-05-07
Mobilizing lay health workers (LHWs) to deliver evidence-based treatments (EBTs) is a workforce strategy to address mental health disparities in underserved communities. LHWs can be leveraged to support access to EBTs in a variety of ways, from conducting outreach for EBTs delivered by professional providers to serving as the primary treatment providers. This critical review provides an overview of how LHW-supported or -delivered EBTs have been leveraged in low-, middle-, and high-income countries (HICs). We propose a conceptual model for LHWs to address drivers of service disparities, which relate to the overall supply of the EBTs provided and the demand for these treatments. The review provides illustrative case examples that demonstrate how LHWs have been leveraged globally and domestically to increase access to mental health services. It also discusses challenges and recommendations regarding implementing LHW-supported or -delivered EBTs.
A Survey Of Techniques for Managing and Leveraging Caches in GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2014-09-01
Initially introduced as special-purpose accelerators for graphics applications, graphics processing units (GPUs) have now emerged as general purpose computing platforms for a wide range of applications. To address the requirements of these applications, modern GPUs include sizable hardware-managed caches. However, several factors, such as unique architecture of GPU, rise of CPU–GPU heterogeneous computing, etc., demand effective management of caches to achieve high performance and energy efficiency. Recently, several techniques have been proposed for this purpose. In this paper, we survey several architectural and system-level techniques proposed for managing and leveraging GPU caches. We also discuss the importance and challenges of cache management in GPUs. The aim of this paper is to provide the readers insights into cache management techniques for GPUs and motivate them to propose even better techniques for leveraging the full potential of caches in the GPUs of tomorrow.
Segar, Michelle; Taber, Jennifer M; Patrick, Heather; Thai, Chan L; Oh, April
2017-05-18
Communication about physical activity (PA) frames PA and influences what it means to people, including the role it plays in their lives. To the extent that PA messages can be designed to reflect outcomes that are relevant to what people most value experiencing and achieving in their daily lives, the more compelling and effective they will be. Aligned with self-determination theory, this study investigated proximal goals and values that are salient in everyday life and how they could be leveraged through new messaging to better support PA participation among women. The present study was designed to examine the nature of women's daily goals and priorities and investigate women's PA beliefs, feelings, and experiences, in order to identify how PA may compete with or facilitate women's daily goals and priorities. Preliminary recommendations are proposed for designing new PA messages that align PA with women's daily goals and desired experiences to better motivate participation. Eight focus groups were conducted with White, Black, and Hispanic/Latina women aged 22-49, stratified by amount of self-reported PA (29 low active participants, 11 high active participants). Respondents discussed their goals, values, and daily priorities along with beliefs, feelings about and experiences being physically active. Data were collected, coded, and analyzed using a thematic analysis strategy to identify emergent themes. Many of the goals and values that both low and high active participants discussed as desiring and valuing map on to key principles of self-determination theory. However, the discussions among low active participants suggested that their beliefs, feelings, experiences, and definitions of PA were in conflict with their proximal goals, values, and priorities, also undermining their psychological needs for autonomy, competence, and relatedness. Findings from this study can be used to inform and evaluate new physical activity communication strategies that leverage more proximal goals, values, and experiences of happiness and success to better motivate PA among ethnically diverse low active women. Specifically, this research suggests a need to address how women's daily goals and desired experiences may undermine PA participation, in addition to framing PA as facilitating rather than competing with their daily priorities and desired leisure-time experiences.
NASA Astrophysics Data System (ADS)
Pacheco-Guffrey, H. A.
2016-12-01
Classroom teachers face many challenges today, such as new standards, the moving targets of high-stakes tests and teacher evaluations, inconsistent or insufficient access to resources, and evolving education policies. Science education in the K-5 context is even more complex. NGSS can be intimidating, especially to K-5 educators with little science background. High-stakes science tests are slow to catch up with newly drafted state-level science standards, leaving teachers unsure about what to change and when to implement updated standards. Amid all this change, many schools are also piloting new technology programs. Though exciting, tech initiatives can also be overwhelming to teachers who are already overburdened. A practical way to support teachers in science while remaining mindful of these stressors is to design and share resources that leverage other K-5 school initiatives. This is often done by integrating writing or math into science learning to meet Common Core requirements. This presentation will suggest a method for bringing Earth and space science learning into elementary/early childhood classrooms by utilizing the current push for tablet technology. The goal is to make science integration reasonable by linking it to technology programs that are in their early stages. The roles and uses of K-5 Earth and space science apps will be examined in this presentation. These apps will be linked to NGSS standards as well as to the science and engineering practices. To complement the app resources, two support frameworks will also be shared. They are designed to help educators consider new technologies in the context of their own classrooms and lessons. The SAMR Model (Puentedura, 2012) is a conceptual framework that helps teachers think critically about the means and purposes of integrating technology into existing lessons. A practical framework created by the author will also be shared. It is designed to help teachers identify and address the important logistical and curricular decision-making aspects of integrating technology into K-5 classroom science. This method provides clear applications for new technology while also bringing meaningful Earth and space science learning into K-5 classrooms.
20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)
The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...
Soft Actuators for Small-Scale Robotics.
Hines, Lindsey; Petersen, Kirstin; Lum, Guo Zhan; Sitti, Metin
2017-04-01
This review comprises a detailed survey of ongoing methodologies for soft actuators, highlighting approaches suitable for nanometer- to centimeter-scale robotic applications. Soft robots present a special design challenge in that their actuation and sensing mechanisms are often highly integrated with the robot body and overall functionality. When less than a centimeter, they belong to an even more special subcategory of robots or devices, in that they often lack on-board power, sensing, computation, and control. Soft, active materials are particularly well suited for this task, with a wide range of stimuli and a number of impressive examples demonstrating large deformations, high motion complexities, and varied multifunctionality. Recent research includes both the development of new materials and composites, as well as novel implementations leveraging the unique properties of soft materials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Introduction to the Special Issue on Digital Signal Processing in Radio Astronomy
NASA Astrophysics Data System (ADS)
Price, D. C.; Kocz, J.; Bailes, M.; Greenhill, L. J.
2016-03-01
Advances in astronomy are intimately linked to advances in digital signal processing (DSP). This special issue is focused upon advances in DSP within radio astronomy. The trend within that community is to use off-the-shelf digital hardware where possible and to leverage advances in high performance computing. In particular, graphics processing units (GPUs) and field programmable gate arrays (FPGAs) are being used in place of application-specific integrated circuits (ASICs); high-speed Ethernet and InfiniBand are being used for interconnect in place of custom backplanes. Further, to lower hurdles in digital engineering, communities have designed and released general-purpose FPGA-based DSP systems, such as the CASPER ROACH board, ASTRON Uniboard, and CSIRO Redback board. In this introductory paper, we give a brief historical overview, a summary of recent trends, and an outlook on future directions.
de Wit, Bianca; Badcock, Nicholas A.; Grootswagers, Tijl; Hardwick, Katherine; Teichmann, Lina; Wehrman, Jordan; Williams, Mark; Kaplan, David Michael
2017-01-01
Active research-driven approaches that successfully incorporate new technology are known to catalyze student learning. Yet achieving these objectives in neuroscience education is especially challenging due to the prohibitive costs and technical demands of research-grade equipment. Here we describe a method that circumvents these factors by leveraging consumer EEG-based neurogaming technology to create an affordable, scalable, and highly portable teaching laboratory for undergraduate courses in neuroscience. This laboratory is designed to give students hands-on research experience, consolidate their understanding of key neuroscience concepts, and provide a unique real-time window into the working brain. Survey results demonstrate that students found the lab sessions engaging. Students also reported the labs enhanced their knowledge about EEG, their course material, and neuroscience research in general. PMID:28690430
17 CFR 31.22 - Prohibited trading in leverage contracts.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 1 2011-04-01 2011-04-01 false Prohibited trading in leverage contracts. 31.22 Section 31.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.22 Prohibited trading in leverage contracts. No futures commission merchant or...
17 CFR 31.22 - Prohibited trading in leverage contracts.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 1 2012-04-01 2012-04-01 false Prohibited trading in leverage contracts. 31.22 Section 31.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.22 Prohibited trading in leverage contracts. No futures commission merchant or...
17 CFR 31.22 - Prohibited trading in leverage contracts.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 1 2013-04-01 2013-04-01 false Prohibited trading in leverage contracts. 31.22 Section 31.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.22 Prohibited trading in leverage contracts. No futures commission merchant or...
17 CFR 31.22 - Prohibited trading in leverage contracts.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Prohibited trading in leverage contracts. 31.22 Section 31.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.22 Prohibited trading in leverage contracts. No futures commission merchant or...
17 CFR 31.22 - Prohibited trading in leverage contracts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Prohibited trading in leverage contracts. 31.22 Section 31.22 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION LEVERAGE TRANSACTIONS § 31.22 Prohibited trading in leverage contracts. No futures commission merchant or...
Software design by reusing architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay; Nii, H. Penny
1992-01-01
Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.
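As a rough sketch of this instantiation idea (not the authors' implementation; the class names here are hypothetical), a generic architecture can be expressed as a template that fixes the control flow while leaving the individual stages open for customization:

```python
from abc import ABC, abstractmethod

class GenericPipeline(ABC):
    """A reusable architecture: fixed control flow, customizable stages."""

    def run(self, data):
        # The architecture fixes the order of stages; subclasses supply them.
        return self.report(self.analyze(self.ingest(data)))

    @abstractmethod
    def ingest(self, data): ...
    @abstractmethod
    def analyze(self, records): ...
    @abstractmethod
    def report(self, results): ...

class WordCountPipeline(GenericPipeline):
    """One instantiation of the generic architecture for a specific requirement."""
    def ingest(self, data):
        return data.split()

    def analyze(self, records):
        counts = {}
        for word in records:
            counts[word] = counts.get(word, 0) + 1
        return counts

    def report(self, results):
        return sorted(results.items(), key=lambda kv: -kv[1])

print(WordCountPipeline().run("to be or not to be"))
```

Each concrete subclass corresponds to one instantiation of the generic architecture for a specific set of requirements.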
17 CFR 31.8 - Cover of leverage contracts.
Code of Federal Regulations, 2014 CFR
2014-04-01
... current market value of the commodity represented by each receipt. (ii) Warehouse receipts for gold bullion in the case of leverage contracts on bulk gold coins, bulk gold coins in the case of leverage contracts on gold bullion, silver bullion in the case of leverage contracts on bulk silver coins, bulk...
17 CFR 31.8 - Cover of leverage contracts.
Code of Federal Regulations, 2012 CFR
2012-04-01
... current market value of the commodity represented by each receipt. (ii) Warehouse receipts for gold bullion in the case of leverage contracts on bulk gold coins, bulk gold coins in the case of leverage contracts on gold bullion, silver bullion in the case of leverage contracts on bulk silver coins, bulk...
17 CFR 31.8 - Cover of leverage contracts.
Code of Federal Regulations, 2013 CFR
2013-04-01
... current market value of the commodity represented by each receipt. (ii) Warehouse receipts for gold bullion in the case of leverage contracts on bulk gold coins, bulk gold coins in the case of leverage contracts on gold bullion, silver bullion in the case of leverage contracts on bulk silver coins, bulk...
3D printing for the design and fabrication of polymer-based gradient scaffolds.
Bracaglia, Laura G; Smith, Brandon T; Watson, Emma; Arumugasaamy, Navein; Mikos, Antonios G; Fisher, John P
2017-07-01
To accurately mimic the native tissue environment, tissue engineered scaffolds often need to have a highly controlled and varied display of three-dimensional (3D) architecture and geometrical cues. Additive manufacturing in tissue engineering has made possible the development of complex scaffolds that mimic the native tissue architectures. As such, architectural details that were previously unattainable or irreproducible can now be incorporated in an ordered and organized approach, further advancing the structural and chemical cues delivered to cells interacting with the scaffold. This control over the environment has given engineers the ability to unlock cellular machinery that is highly dependent upon the intricate heterogeneous environment of native tissue. Recent research into the incorporation of physical and chemical gradients within scaffolds indicates that integrating these features improves the function of a tissue engineered construct. This review covers recent advances on techniques to incorporate gradients into polymer scaffolds through additive manufacturing and evaluate the success of these techniques. As covered here, to best replicate different tissue types, one must be cognizant of the vastly different types of manufacturing techniques available to create these gradient scaffolds. We review the various types of additive manufacturing techniques that can be leveraged to fabricate scaffolds with heterogeneous properties and discuss methods to successfully characterize them. Additive manufacturing techniques have given tissue engineers the ability to precisely recapitulate the native architecture present within tissue. In addition, these techniques can be leveraged to create scaffolds with both physical and chemical gradients. This work offers insight into several techniques that can be used to generate graded scaffolds, depending on the desired gradient. Furthermore, it outlines methods to determine if the designed gradient was achieved. This review will help to condense the abundance of information that has been published on the creation and characterization of gradient scaffolds and to provide a single review discussing both methods for manufacturing gradient scaffolds and evaluating the establishment of a gradient. Copyright © 2017. Published by Elsevier Ltd.
Stochastic Simulation Tool for Aerospace Structural Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F.; Moore, David F.
2006-01-01
Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
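To make the Monte Carlo idea concrete, the sketch below is illustrative only: the `response` function is a stand-in for the finite element solve the tool would perform, and the scatter values are assumed. It propagates input scatter to an output distribution and ranks input influence, mirroring the cause-and-effect insight described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a finite element response: panel deflection as a
# function of thickness t and pressure load p (a real study would call FEA).
def response(t, p):
    return p / t**3

n = 10_000
t = rng.normal(2.0, 0.05, n)   # thickness with manufacturing scatter (assumed)
p = rng.normal(1.0, 0.10, n)   # load with uncertainty (assumed)

w = response(t, p)
print(f"mean deflection {w.mean():.3f}, std {w.std():.3f}")

# Rank inputs by their influence on the output via correlation.
for name, x in [("thickness", t), ("pressure", p)]:
    print(name, round(np.corrcoef(x, w)[0, 1], 2))
```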
Multidisciplinary Design, Analysis, and Optimization Tool Development using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2008-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center to automate the analysis and design process by leveraging existing tools such as NASTRAN, ZAERO and CFD codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but faces many challenges in large-scale, real-world application. This paper describes current approaches, recent results, and challenges for MDAO as demonstrated by our experience with the Ikhana fire pod design.
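For readers unfamiliar with the optimizer itself, the following minimal genetic algorithm loop is a generic sketch, not the MDAO tool; the fitness function is a toy stand-in for an expensive NASTRAN/ZAERO/CFD evaluation. It shows the selection, crossover, and mutation steps involved.

```python
import random

random.seed(1)

def fitness(x):
    # Toy objective standing in for an expensive multidisciplinary analysis.
    return -sum((xi - 0.3) ** 2 for xi in x)

def evolve(pop_size=40, n_genes=5, generations=50, mut=0.1):
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)   # single-point crossover
            child = a[:cut] + b[cut:]
            child = [g + random.gauss(0, mut) for g in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```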
Mars Ascent Vehicle Needs Technology Development with a Focus on High Propellant Fractions
NASA Astrophysics Data System (ADS)
Whitehead, J. C.
2018-04-01
Launching from Mars to orbit requires a miniature launch vehicle, beyond any known spacecraft propulsion. The Mars Ascent Vehicle (MAV) needs an unusually high propellant mass fraction. MAV mass has high leverage for the cost of Mars Sample Return.
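The propellant mass fraction requirement follows directly from the ideal rocket equation; the delta-v and specific impulse below are illustrative assumptions, not values from the abstract:

```latex
\zeta = \frac{m_p}{m_0} = 1 - e^{-\Delta v/(g_0 I_{sp})}
\qquad\text{e.g.}\qquad
\zeta \approx 1 - e^{-4000/(9.81 \times 300)} \approx 0.74
```

Under these assumed numbers (about 4 km/s to low Mars orbit at 300 s specific impulse), roughly three quarters of the MAV liftoff mass would need to be propellant.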
Motivation for DOC III: 64-bit digital optical computer
NASA Astrophysics Data System (ADS)
Guilfoyle, Peter S.
1991-09-01
OptiComp has focused on a digital optical logic family in order to capitalize on the inherent benefits of optical computing, which include (1) high FAN-IN and FAN-OUT, (2) low power consumption, (3) high noise margin, (4) high algorithmic efficiency using 'smart' interconnects, and (5) free-space leverage of gate interconnect bandwidth product. Other well-known secondary advantages of optical logic include zero capacitive loading of signals at a detector, zero cross-talk between signals, zero signal dispersion, and minimal clock skew (a few picoseconds or less in an imaging system). The primary focus of this paper is to demonstrate how each of the five advantages can be used to leverage other logic family performance such as GaAs; the secondary attributes are discussed only in the context of introducing the DOC III architecture.
FPGA Acceleration of the phylogenetic likelihood function for Bayesian MCMC inference methods.
Zierke, Stephanie; Bakos, Jason D
2010-04-12
Maximum likelihood (ML)-based phylogenetic inference has become a popular method for estimating the evolutionary relationships among species based on genomic sequence data. This method is used in applications such as RAxML, GARLI, MrBayes, PAML, and PAUP. The Phylogenetic Likelihood Function (PLF) is an important kernel computation for this method. The PLF consists of a loop with no conditional behavior or dependencies between iterations. As such, it contains a high potential for exploiting parallelism using micro-architectural techniques. In this paper, we describe a technique for mapping the PLF and supporting logic onto a Field Programmable Gate Array (FPGA)-based co-processor. By leveraging the FPGA's on-chip DSP modules and the high-bandwidth local memory attached to the FPGA, the resultant co-processor can accelerate ML-based methods and outperform state-of-the-art multi-core processors. We use the MrBayes 3 tool as a framework for designing our co-processor. For large datasets, we estimate that our accelerated MrBayes, if run on a current-generation FPGA, achieves a 10x speedup relative to software running on a state-of-the-art server-class microprocessor. The FPGA-based implementation achieves its performance by deeply pipelining the likelihood computations, performing multiple floating-point operations in parallel, and through a natural log approximation that is chosen specifically to leverage a deeply pipelined custom architecture. Heterogeneous computing, which combines general-purpose processors with special-purpose co-processors such as FPGAs and GPUs, is a promising approach for high-performance phylogeny inference, as shown by the growing body of literature in this field. FPGAs in particular are well-suited for this task because of their low power consumption as compared to many-core processors and Graphics Processor Units (GPUs).
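For orientation, the PLF kernel is essentially the Felsenstein pruning step; the sketch below is a generic illustration for nucleotide data, not the MrBayes or FPGA code, and shows why each site can be processed independently and hence deeply pipelined:

```python
import numpy as np

def plf_node(left_partials, right_partials, P_left, P_right):
    """Combine child conditional likelihoods at one internal node.

    left_partials, right_partials: (n_sites, 4) conditional likelihoods
    P_left, P_right: (4, 4) substitution probability matrices for each branch
    Each site is independent, which is what makes the loop easy to pipeline
    or parallelize on an FPGA or GPU.
    """
    # Probability of the data below each child, given each parent state.
    from_left = left_partials @ P_left.T     # (n_sites, 4)
    from_right = right_partials @ P_right.T
    return from_left * from_right

# Tiny example with 3 sites and an (assumed) Jukes-Cantor-like matrix.
P = np.full((4, 4), 0.05) + np.eye(4) * 0.80
tips = np.eye(4)[[0, 2, 1]]                  # observed bases A, G, C as one-hot
print(plf_node(tips, tips, P, P))
```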
Learning to Leverage Student Thinking: What Novice Approximations Teach Us about Ambitious Practice
ERIC Educational Resources Information Center
Singer-Gabella, Marcy; Stengel, Barbara; Shahan, Emily; Kim, Min-Joung
2016-01-01
Central to ambitious teaching is a constellation of practices we have come to call "leveraging student thinking." In leveraging, teachers position students' understanding and reasoning as a central means to drive learning forward. While leveraging typically is described as a feature of mature practice, in this article we examine…
17 CFR 31.23 - Limited right to rescind first leverage contract.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Limited right to rescind first... COMMISSION LEVERAGE TRANSACTIONS § 31.23 Limited right to rescind first leverage contract. (a) A leverage... pursuant to the following provisions: (1) Such customer may be assessed actual price losses accruing to the...
Code of Federal Regulations, 2010 CFR
2010-01-01
... NMVC Company-application procedure, amount, and term. 108.1200 Section 108.1200 Business Credit and... Assistance for NMVC Companies (Leverage) Conditional Commitments by Sba to Reserve Leverage for A Nmvc Company § 108.1200 SBA's Leverage commitment to a NMVC Company—application procedure, amount, and term. (a...
Code of Federal Regulations, 2010 CFR
2010-04-01
... financial, cover and segregation requirements by leverage transaction merchants. 31.7 Section 31.7 Commodity... of minimum financial, cover and segregation requirements by leverage transaction merchants. (a) Each... required by § 31.8, or that the amount of leverage customer funds in segregation is less than is required...
Code of Federal Regulations, 2011 CFR
2011-04-01
... financial, cover and segregation requirements by leverage transaction merchants. 31.7 Section 31.7 Commodity... of minimum financial, cover and segregation requirements by leverage transaction merchants. (a) Each... required by § 31.8, or that the amount of leverage customer funds in segregation is less than is required...
ERIC Educational Resources Information Center
Bitz, Michael; Emejulu, Obiajulu
2016-01-01
This article is an international reflection on literacy, creativity, and student engagement. The authors collaborated to help Nigerian youths and their teachers develop, design, and share original comic books. By leveraging student engagement for literacy learning, the authors highlighted the crucial role of creativity in the classroom. The…
Ontologies for Effective Use of Context in E-Learning Settings
ERIC Educational Resources Information Center
Jovanovic, Jelena; Gasevic, Dragan; Knight, Colin; Richards, Griff
2007-01-01
This paper presents an ontology-based framework aimed at explicit representation of context-specific metadata derived from the actual usage of learning objects and learning designs. The core part of the proposed framework is a learning object context ontology, that leverages a range of other kinds of learning ontologies (e.g., user modeling…
Internal Social Media's Impact on Socialization and Commitment
ERIC Educational Resources Information Center
Gonzalez, Ester S.
2012-01-01
Social media technologies present an opportunity for organizations to create value by acclimating new employees and increasing organizational commitment. Past research has indicated that many organizations have leveraged social media in innovative ways. The purpose of this study is to investigate an internal social media tool that was designed and…
Cross-Organizational Knowledge Sharing: Information Reuse in Small Organizations
ERIC Educational Resources Information Center
White, Kevin Forsyth
2010-01-01
Despite the potential value of leveraging organizational memory and expertise, small organizations have been unable to capitalize on its promised value. Existing solutions have largely side-stepped the unique needs of these organizations, which are relegated to systems designed to take advantage of large pools of experts or to use Internet sources…
ERIC Educational Resources Information Center
Kohlbacher, Florian; Mukai, Kazuo
2007-01-01
Purpose: This paper aims to explain and analyze community-based corporate knowledge sharing and organizational learning, the actual use of communities in Hewlett Packard (HP) Consulting and Integration (CI) and their role in leveraging and exploiting existing and creating new knowledge. Design/methodology/approach: The paper presents an…
The Wikipedia Project: Changing Students from Consumers to Producers
ERIC Educational Resources Information Center
Sweeney, Meghan
2012-01-01
Whenever the author teaches English 102, a research-focused, second-semester composition course, at least one student asks her whether or not she "allows" Wikipedia. She then redesigned her course to leverage Wikipedia as a source of inquiry. In other words, she "allows" Wikipedia, but through the Wikipedia project, which is designed to address…
Leveraging Volunteers: An Experimental Evaluation of a Tutoring Program for Struggling Readers
ERIC Educational Resources Information Center
Jacob, Robin; Armstrong, Catherine; Bowden, A. Brooks; Pan, Yilin
2016-01-01
This study evaluates the impacts and costs of the Reading Partners program, which uses community volunteers to provide one-on-one tutoring to struggling readers in under-resourced elementary schools. The evaluation uses an experimental design. Students were randomly assigned within 19 different Reading Partners sites to a program or control…
Counter-Mapping the Neighborhood on Bicycles: Mobilizing Youth to Reimagine the City
ERIC Educational Resources Information Center
Taylor, Katie Headrick; Hall, Rogers
2013-01-01
Personal mobility is a mundane characteristic of daily life. However, mobility is rarely considered an opportunity for learning in the learning sciences, and is almost never leveraged as relevant, experiential material for teaching. This article describes a social design experiment for spatial justice that focused on changes in the personal…
Leveraging 21st Century Learning & Technology to Create Caring Diverse Classroom Cultures
ERIC Educational Resources Information Center
Tarbutton, Tanya
2018-01-01
Creating diverse caring classroom environments, for all students, using innovative technology, is the impetus of this article. Administrators and teachers in many states have worked to integrate 21st Century Learning Outcomes and Local Control and Accountability Plans (LCAP) into daily teaching and learning. These initiatives are designed to…
Ahead of the Curve: Implementation Challenges in Personalized Learning School Models
ERIC Educational Resources Information Center
Bingham, Andrea J.; Pane, John F.; Steiner, Elizabeth D.; Hamilton, Laura S.
2018-01-01
In the current educational context, school models that leverage technology to personalize instruction have proliferated, as has student enrollment in, and funding of, such school models. However, even the best laid plans are subject to challenges in design and practice, particularly in the dynamic context of a school. In this collective case…
ERIC Educational Resources Information Center
Hoffman, Daniel L.
2013-01-01
The purpose of the study is to better understand the role of physicality, interactivity, and interface effects in learning with digital content. Drawing on work in cognitive science, human-computer interaction, and multimedia learning, the study argues that interfaces that promote physical interaction can provide "conceptual leverage"…
Exploring and Leveraging Chinese International Students' Strengths for Success
ERIC Educational Resources Information Center
He, Ye; Hutson, Bryant
2018-01-01
This study used an Appreciative Education framework to explore the strengths of Chinese international students and to identify areas where support is needed during their transition to U.S. higher education settings. Using a convergent mixed methods design with data collected from surveys, interviews and focus groups, the complex nature of the…
Leveraging Digital Tools to Build Educative Curricula for Teachers: Two Promising Approaches
ERIC Educational Resources Information Center
Bates, Meg S.
2017-01-01
Well-designed curriculum materials include educative components that help teachers effectively plan, implement, and adapt activities for diverse learners. Digital materials offer several affordances over print materials in the format, fit, and flexibility of the educative information provided to teachers, as well as the ability of the materials to…
ERIC Educational Resources Information Center
Zelik, Daniel J.
2012-01-01
Cognitive Systems Engineering (CSE) has a history built, in part, on leveraging representational design to improve system performance. Traditionally, however, CSE has focused on visual representation of "monitored" processes--active, ongoing, and interconnected activities occurring in a system of interest and monitored by human…
Prospective Elementary Teachers Making Sense of Multidigit Multiplication: Leveraging Resources
ERIC Educational Resources Information Center
Whitacre, Ian; Nickerson, Susan D.
2016-01-01
This study examines how collective activity related to multiplication evolved over several class sessions in an elementary mathematics content course that was designed to foster prospective elementary teachers' number-sense development. We document how the class drew on as-if-shared ideas to make sense of multidigit multiplication in terms of…
ERIC Educational Resources Information Center
Gobert, Janice D.; Sao Pedro, Michael A.; Baker, Ryan S. J. D.; Toto, Ermal; Montalvo, Orlando
2012-01-01
We present "Science Assistments," an interactive environment, which assesses students' inquiry skills as they engage in inquiry using science microworlds. We frame our variables, tasks, assessments, and methods of analyzing data in terms of "evidence-centered design." Specifically, we focus on the "student model," the…
Accountable Game Design: Structuring the Dynamics of Student Learning Interactions
ERIC Educational Resources Information Center
Charoenying, Timothy
2010-01-01
Game-based classroom activity is intended to leverage students' interest and motivation to play, and to provide safe contexts for supporting students' academic learning. However, a basic criticism of many games currently used in classroom settings is that they can fail to meaningfully embody academic content. A more subtle concern is that…
2-micron Pulsed Direct Detection IPDA Lidar for Atmospheric CO2 Measurements
NASA Astrophysics Data System (ADS)
Yu, J.; Singh, U.; Petros, M.
2012-12-01
A 2-micron high energy, pulsed Integrated Path Differential Absorption (IPDA) lidar is being developed for atmospheric CO2 measurements. Development of this lidar heavily leverages the 2-micron laser technologies developed at LaRC over the last decade. The high pulse energy, direct detection lidar operating in the CO2 2-micron absorption band provides an alternate approach to measuring CO2 concentrations with significant advantages. It is expected to provide high-precision measurement capability by unambiguously eliminating contamination from aerosols and clouds that can bias the IPDA measurement. Our objective is to integrate an existing high energy double-pulsed 2-micron laser transmitter with a direct detection receiver and telescope to enable an airborne capability to perform a first proof-of-principle demonstration of airborne direct detection CO2 measurements. The 2-micron transmitter provides 100 mJ at 10 Hz with a double pulse format specifically designed for DIAL/IPDA instruments. The compact, rugged, highly reliable transceiver is based on unique Ho:Tm:YLF high-energy 2-micron pulsed laser technology. All the optical mounts are custom designed and have space heritage. A 16-inch diameter telescope has been designed and is being manufactured for the direct detection lidar. The detector is an InGaAs Positive-Intrinsic-Negative (PIN) photodiode manufactured by Hamamatsu Corporation. The performance of the detector is characterized at various operating temperatures and bias voltages for spectral response, NEP, response time, dynamic range, and linearity. A collinear lidar structure is designed to be integrated into NASA UC-12 or B200 research aircraft. This paper will describe the design of the airborne 2-micron pulsed IPDA lidar system; the lidar operation parameters; the wavelength pair selection; laser transmitter energy, pulse rate, beam divergence, double pulse generation and accurate frequency control; detector characterization; telescope design; lidar structure design; and lidar signal-to-noise ratio estimation. The first engineering flight is scheduled at the end of next year.
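For context, the retrieval behind the on-line/off-line wavelength pair is conventionally expressed as follows; this is the textbook IPDA form, not an equation taken from this instrument paper:

```latex
\Delta\tau = \tfrac{1}{2}\,
  \ln\!\left(\frac{P_{\mathrm{off}}\,E_{\mathrm{on}}}{P_{\mathrm{on}}\,E_{\mathrm{off}}}\right),
\qquad
\overline{N}_{\mathrm{CO_2}} \approx \frac{\Delta\tau}{\Delta\sigma\,R}
```

where P_on and P_off are the received pulse energies, E_on and E_off the transmitted pulse energies, Δσ the differential absorption cross section of the wavelength pair, and R the path length to the scattering surface.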
Design of Two RadWorks Storm Shelters for Solar Particle Event Shielding
NASA Technical Reports Server (NTRS)
Simon, Matthew; Cerro, Jeffery; Latorella, Kara; Clowdsley, Martha; Watson, Judith; Albertson, Cindy; Norman, Ryan; Le Boffe, Vincent; Walker, Steven
2014-01-01
In order to enable long-duration human exploration beyond low-Earth orbit, the risks associated with exposure of astronaut crews to space radiation must be mitigated with practical and affordable solutions. The space radiation environment beyond the magnetosphere is primarily a combination of two types of radiation: galactic cosmic rays (GCR) and solar particle events (SPE). While mitigating GCR exposure remains an open issue, reducing astronaut exposure to SPEs is achievable through material shielding because they are made up primarily of medium-energy protons. In order to ensure astronaut safety for long durations beyond low-Earth orbit, SPE radiation exposure must be mitigated. However, the increasingly demanding spacecraft propulsive performance for these ambitious missions requires minimal mass and volume radiation shielding solutions which leverage available multi-functional habitat structures and logistics as much as possible. This paper describes the efforts of NASA's RadWorks Advanced Exploration Systems (AES) Project to design two minimal mass SPE radiation shelter concepts leveraging available resources: one based upon reconfiguring habitat interiors to create a centralized protection area and one based upon augmenting individual crew quarters with waterwalls and logistics. Discussion items include the design features of the concepts, a radiation analysis of their implementations, an assessment of the parasitic mass of each concept, and the result of a human in the loop evaluation performed to drive out design and operational issues.
Choi, Changsoon; Choi, Moon Kee; Liu, Siyi; Kim, Min Sung; Park, Ok Kyu; Im, Changkyun; Kim, Jaemin; Qin, Xiaoliang; Lee, Gil Ju; Cho, Kyoung Won; Kim, Myungbin; Joh, Eehyung; Lee, Jongha; Son, Donghee; Kwon, Seung-Hae; Jeon, Noo Li; Song, Young Min; Lu, Nanshu; Kim, Dae-Hyeong
2017-11-21
Soft bioelectronic devices provide new opportunities for next-generation implantable devices owing to their soft mechanical nature that leads to minimal tissue damage and immune responses. However, a soft form of the implantable optoelectronic device for optical sensing and retinal stimulation has not been developed yet because of the bulkiness and rigidity of conventional imaging modules and their composing materials. Here, we describe a high-density and hemispherically curved image sensor array that leverages the atomically thin MoS2-graphene heterostructure and strain-releasing device designs. The hemispherically curved image sensor array exhibits infrared blindness and successfully acquires pixelated optical signals. We corroborate the validity of the proposed soft materials and ultrathin device designs through theoretical modeling and finite element analysis. Then, we propose the ultrathin hemispherically curved image sensor array as a promising imaging element in the soft retinal implant. The CurvIS array is applied as a human eye-inspired soft implantable optoelectronic device that can detect optical signals and apply programmed electrical stimulation to optic nerves with minimum mechanical side effects to the retina.
12 CFR 567.8 - Leverage ratio.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Leverage ratio. 567.8 Section 567.8 Banks and... § 567.8 Leverage ratio. (a) The minimum leverage capital requirement for a savings association assigned a composite rating of 1, as defined in § 516.3 of this chapter, shall consist of a ratio of core...
NASA Astrophysics Data System (ADS)
Hu, Yue-Houng; Rottmann, Joerg; Fueglistaller, Rony; Myronakis, Marios; Wang, Adam; Huber, Pascal; Shedlock, Daniel; Morf, Daniel; Baturin, Paul; Star-Lack, Josh; Berbeco, Ross
2018-02-01
While megavoltage cone-beam computed tomography (CBCT) using an electronic portal imaging device (EPID) provides many advantages over kilovoltage (kV) CBCT, clinical adoption is limited by its high doses. Multi-layer imager (MLI) EPIDs increase DQE(0) while maintaining high resolution. However, even well-designed, high-performance MLIs suffer from increased electronic noise from each readout, degrading low-dose image quality. To improve low-dose performance, shift-and-bin addition (ShiBA) imaging is proposed, leveraging the unique architecture of the MLI. ShiBA combines hardware readout-binning and super-resolution concepts, reducing electronic noise while maintaining native image sampling. The imaging performance of full-resolution (FR); standard, aligned binned (BIN); and ShiBA images is compared in terms of noise power spectrum (NPS), electronic NPS, modulation transfer function (MTF), and the ideal observer signal-to-noise ratio, the detectability index (d′). The FR 4-layer readout of the prototype MLI exhibits an electronic NPS magnitude six times higher than a state-of-the-art single-layer (SLI) EPID. Although the MLI is built on the same readout platform as the SLI, with each layer exhibiting equivalent electronic noise, the multi-stage readout of the MLI results in electronic noise 50% higher than simple summation. Electronic noise is mitigated in both BIN and ShiBA imaging, reducing its total by a factor of roughly 12. ShiBA further reduces the NPS, effectively upsampling the image, resulting in a multiplication by a sinc² function. Normalized NPS show that neither ShiBA nor BIN otherwise affects image noise. The line spread function (LSF) shows that ShiBA removes the pixelation artifact of BIN images and mitigates the effect of detector shift, but does not quantifiably improve the MTF. ShiBA provides a pre-sampled representation of the images, mitigating phase dependence. Hardware binning strategies lower the quantum noise floor, with the 2 × 2 implementation reducing the dose at which DQE(0) degrades by 10% from 0.01 MU to 0.004 MU, representing a 20% improvement in d′.
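One way to picture shift-and-bin addition, under the assumption (ours, for illustration only) that each MLI layer is binned 2 × 2 in hardware with a one-pixel offset between layers and the readouts are recombined on the native grid, is the following numpy sketch; it is not the authors' published algorithm:

```python
import numpy as np

def bin2x2(img, dy, dx):
    """Hardware-style 2x2 binning of one detector layer, offset by (dy, dx) pixels."""
    shifted = np.roll(img, (dy, dx), axis=(0, 1))
    h, w = shifted.shape
    return shifted[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

def shiba(layers):
    """Combine per-layer binned readouts back onto the native sampling grid."""
    out = np.zeros_like(layers[0], dtype=float)
    offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]      # assumed one-pixel shifts per layer
    for layer, (dy, dx) in zip(layers, offsets):
        binned = bin2x2(layer, dy, dx)
        up = np.repeat(np.repeat(binned, 2, axis=0), 2, axis=1)  # back to native grid
        out += np.roll(up, (-dy, -dx), axis=(0, 1))
    return out / len(layers)

layers = [np.random.default_rng(i).poisson(100.0, (8, 8)).astype(float) for i in range(4)]
print(shiba(layers).shape)   # (8, 8): native sampling retained despite binned readout
```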
NASA Astrophysics Data System (ADS)
Srikantha, Pirathayini
Today's electric grid is rapidly evolving to provision for heterogeneous system components (e.g. intermittent generation, electric vehicles, storage devices, etc.) while catering to diverse consumer power demand patterns. In order to accommodate this changing landscape, the widespread integration of cyber communication with physical components can be witnessed in all tenets of the modern power grid. This ubiquitous connectivity provides an elevated level of awareness and decision-making ability to system operators. Moreover, devices that were typically passive in the traditional grid are now `smarter' as these can respond to remote signals, learn about local conditions and even make their own actuation decisions if necessary. These advantages can be leveraged to reap unprecedented long-term benefits that include sustainable, efficient and economical power grid operations. Furthermore, challenges introduced by emerging trends in the grid such as high penetration of distributed energy sources, rising power demands, deregulations and cyber-security concerns due to vulnerabilities in standard communication protocols can be overcome by tapping onto the active nature of modern power grid components. In this thesis, distributed constructs in optimization and game theory are utilized to design the seamless real-time integration of a large number of heterogeneous power components such as distributed energy sources with highly fluctuating generation capacities and flexible power consumers with varying demand patterns to achieve optimal operations across multiple levels of hierarchy in the power grid. Specifically, advanced data acquisition, cloud analytics (such as prediction), control and storage systems are leveraged to promote sustainable and economical grid operations while ensuring that physical network, generation and consumer comfort requirements are met. Moreover, privacy and security considerations are incorporated into the core of the proposed designs and these serve to improve the resiliency of the future smart grid. It is demonstrated both theoretically and practically that the techniques proposed in this thesis are highly scalable and robust with superior convergence characteristics. These distributed and decentralized algorithms allow individual actuating nodes to execute self-healing and adaptive actions when exposed to changes in the grid so that the optimal operating state in the grid is maintained consistently.
RNA design rules from a massive open laboratory
Lee, Jeehyung; Kladwang, Wipapat; Lee, Minjae; Cantu, Daniel; Azizyan, Martin; Kim, Hanjoo; Limpaecher, Alex; Gaikwad, Snehal; Yoon, Sungroh; Treuille, Adrien; Das, Rhiju
2014-01-01
Self-assembling RNA molecules present compelling substrates for the rational interrogation and control of living systems. However, imperfect in silico models—even at the secondary structure level—hinder the design of new RNAs that function properly when synthesized. Here, we present a unique and potentially general approach to such empirical problems: the Massive Open Laboratory. The EteRNA project connects 37,000 enthusiasts to RNA design puzzles through an online interface. Uniquely, EteRNA participants not only manipulate simulated molecules but also control a remote experimental pipeline for high-throughput RNA synthesis and structure mapping. We show herein that the EteRNA community leveraged dozens of cycles of continuous wet laboratory feedback to learn strategies for solving in vitro RNA design problems on which automated methods fail. The top strategies—including several previously unrecognized negative design rules—were distilled by machine learning into an algorithm, EteRNABot. Over a rigorous 1-y testing phase, both the EteRNA community and EteRNABot significantly outperformed prior algorithms in a dozen RNA secondary structure design tests, including the creation of dendrimer-like structures and scaffolds for small molecule sensors. These results show that an online community can carry out large-scale experiments, hypothesis generation, and algorithm design to create practical advances in empirical science. PMID:24469816
RadWorks Storm Shelter Design for Solar Particle Event Shielding
NASA Technical Reports Server (NTRS)
Simon, Matthew A.; Cerro, Jeffrey; Clowdsley, Martha
2013-01-01
In order to enable long-duration human exploration beyond low-Earth orbit, the risks associated with exposure of astronaut crews to space radiation must be mitigated with practical and affordable solutions. The space radiation environment beyond the magnetosphere is primarily a combination of two types of radiation: galactic cosmic rays (GCR) and solar particle events (SPE). While mitigating GCR exposure remains an open issue, reducing astronaut exposure to SPEs is achievable through material shielding because they are made up primarily of medium-energy protons. In order to ensure astronaut safety for long durations beyond low-Earth orbit, SPE radiation exposure must be mitigated. However, the increasingly demanding spacecraft propulsive performance for these ambitious missions requires minimal mass and volume radiation shielding solutions which leverage available multi-functional habitat structures and logistics as much as possible. This paper describes the efforts of NASA's RadWorks Advanced Exploration Systems (AES) Project to design minimal mass SPE radiation shelter concepts leveraging available resources. Discussion items include a description of the shelter trade space, the prioritization process used to identify the four primary shelter concepts chosen for maturation, a summary of each concept's design features, a description of the radiation analysis process, and an assessment of the parasitic mass of each concept.
Adaptive Multi-scale PHM for Robotic Assembly Processes
Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.
2017-01-01
Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
Perceived image quality for autostereoscopic holograms in healthcare training
NASA Astrophysics Data System (ADS)
Goldiez, Brian; Abich, Julian; Carter, Austin; Hackett, Matthew
2017-03-01
The current state of dynamic light field holography requires further empirical investigation to ultimately advance this developing technology. This paper describes a user-centered design approach for gaining insight into the features most important to clinical personnel using emerging dynamic holographic displays. The approach describes the generation of a high quality holographic model of a simulated traumatic amputation above the knee using 3D scanning. Using that model, a set of static holographic prints will be created varying in color or monochrome, contrast ratio, and polygon density. Leveraging methods from image quality research, the goal for this paper is to describe an experimental approach wherein participants are asked to provide feedback regarding the elements previously mentioned in order to guide the ongoing evolution of holographic displays.
PCA leverage: outlier detection for high-dimensional functional magnetic resonance imaging data.
Mejia, Amanda F; Nebel, Mary Beth; Eloyan, Ani; Caffo, Brian; Lindquist, Martin A
2017-07-01
Outlier detection for high-dimensional (HD) data is a popular topic in modern statistical research. However, one source of HD data that has received relatively little attention is functional magnetic resonance images (fMRI), which consist of hundreds of thousands of measurements sampled at hundreds of time points. At a time when the availability of fMRI data is rapidly growing, primarily through large, publicly available grassroots datasets, automated quality control and outlier detection methods are greatly needed. We propose principal components analysis (PCA) leverage and demonstrate how it can be used to identify outlying time points in an fMRI run. Furthermore, PCA leverage is a measure of the influence of each observation on the estimation of principal components, which are often of interest in fMRI data. We also propose an alternative measure, PCA robust distance, which is less sensitive to outliers and has controllable statistical properties. The proposed methods are validated through simulation studies and are shown to be highly accurate. We also conduct a reliability study using resting-state fMRI data from the Autism Brain Imaging Data Exchange and find that removal of outliers using the proposed methods results in more reliable estimation of subject-level resting-state networks using independent components analysis. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
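A minimal illustration of the PCA leverage idea, assuming a time-by-voxel data matrix and a simple three-standard-deviation flagging rule (the threshold here is ours, not the authors'): the leverage of each time point is its squared contribution to the retained principal components, i.e. the row norms of the left singular vectors.

```python
import numpy as np

def pca_leverage(X, n_comp=5):
    """Leverage of each time point (row of X) on the top principal components.

    X: (n_timepoints, n_voxels) fMRI data matrix.
    """
    Xc = X - X.mean(axis=0)                          # center over time
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return np.sum(U[:, :n_comp] ** 2, axis=1)        # row sums of squared scores

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1000))
X[17] += 5.0                                         # inject a spike at one time point
lev = pca_leverage(X)
outliers = np.where(lev > lev.mean() + 3 * lev.std())[0]
print(outliers)                                      # should flag time point 17
```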
Modular open RF architecture: extending VICTORY to RF systems
NASA Astrophysics Data System (ADS)
Melber, Adam; Dirner, Jason; Johnson, Michael
2015-05-01
Radio frequency products spanning multiple functions have become increasingly critical to the warfighter. Military use of the electromagnetic spectrum now includes communications, electronic warfare (EW), intelligence, and mission command systems. Due to the urgent needs of counterinsurgency operations, various quick reaction capabilities (QRCs) have been fielded to enhance warfighter capability. Although these QRCs were highly successful in their respective missions, they were designed independently, resulting in significant challenges when integrated on a common platform. This paper discusses how the Modular Open RF Architecture (MORA) addresses these challenges by defining an open architecture for multifunction missions that decomposes monolithic radio systems into high-level components with well-defined functions and interfaces. The functional decomposition maximizes hardware sharing while minimizing added complexity and cost due to modularization. MORA achieves significant size, weight and power (SWaP) savings by allowing hardware such as power amplifiers and antennas to be shared across systems. By separating signal conditioning from the processing that implements the actual radio application, MORA exposes previously inaccessible architecture points, providing system integrators with the flexibility to insert third-party capabilities to address technical challenges and emerging requirements. MORA leverages the Vehicular Integration for Command, Control, Communication, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR)/EW Interoperability (VICTORY) framework. This paper concludes by discussing how MORA, VICTORY and other standards such as OpenVPX are being leveraged by the U.S. Army Research, Development, and Engineering Command (RDECOM) Communications Electronics Research, Development, and Engineering Center (CERDEC) to define a converged architecture enabling rapid technology insertion, interoperability and reduced SWaP.
Davis, Mary V; Cannon, Margaret M; Reese, April; Lovette, Beth; Porterfield, Deborah S
2011-01-01
In 2006, we conducted case studies of 4 North Carolina local health departments (LHDs) that scored highly on an index of diabetes prevention and control performance, to explore characteristics that may serve as barriers or facilitators of diabetes prevention and control services. Case studies involving in-depth interviews were conducted at 4 LHDs. Sites were selected on the basis of 2 variables, known external funding for diabetes services and population size, that were associated with performance in diabetes prevention and control in a 2005 survey of all North Carolina LHDs. Fourteen interviews (individual and group) were conducted among 17 participants from the 4 LHDs. The main outcome measures were LHD characteristics that facilitate or hinder the performance of diabetes programs and services. Interviews revealed that all 4 high-performing LHDs had received some sort of funding from a source external to the LHD. Case study participants indicated that barriers to additional service delivery included low socioeconomic status of the population and lack of financial resources. Having a diabetes self-management education program that was recognized by the American Diabetes Association appeared to be a facilitator of diabetes services provision. Other facilitators were leadership and staff commitment, which appeared to facilitate the leveraging of partnerships and funding opportunities, leading to enhanced service delivery. The small number of LHDs participating in the study and the cross-sectional study design were limitations. Leadership, staff commitment, partnership leveraging, and funding appear to be associated with LHD performance in diabetes prevention and control services. These factors should be further studied in future public health systems and services research.
Ding, Kuan-Fu; Petricoin, Emanuel F; Finlay, Darren; Yin, Hongwei; Hendricks, William P D; Sereduk, Chris; Kiefer, Jeffrey; Sekulic, Aleksandar; LoRusso, Patricia M; Vuori, Kristiina; Trent, Jeffrey M; Schork, Nicholas J
2018-01-12
Cancer cell lines are often used in high throughput drug screens (HTS) to explore the relationship between cell line characteristics and responsiveness to different therapies. Many current analysis methods infer relationships by focusing on one aspect of cell line drug-specific dose-response curves (DRCs), the concentration causing 50% inhibition of a phenotypic endpoint (IC50). Such methods may overlook DRC features and do not simultaneously leverage information about drug response patterns across cell lines, potentially increasing false positive and negative rates in drug response associations. We consider the application of two methods, each rooted in nonlinear mixed effects (NLME) models, that test the relationships between estimated cell line DRCs and factors that might mitigate response. Both methods leverage estimation and testing techniques that consider the simultaneous analysis of different cell lines to draw inferences about any one cell line. One of the methods is designed to provide an omnibus test of the differences between cell line DRCs that is not focused on any one aspect of the DRC (such as the IC50 value). We simulated different settings and compared the different methods on the simulated data. We also compared the proposed methods against traditional IC50-based methods using 40 melanoma cell lines whose transcriptomes, proteomes, and, importantly, BRAF and related mutation profiles were available. Ultimately, we find that the NLME-based methods are more robust, powerful and, for the omnibus test, more flexible than traditional methods. Their application to the melanoma cell lines reveals insights into factors that may be clinically useful.
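To fix ideas, the sketch below fits a four-parameter logistic DRC to one cell line's screening data (synthetic numbers, illustrative only); the NLME approaches discussed above additionally pool information across cell lines through random effects and test whole-curve differences rather than only the IC50.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, log_ic50, hill):
    """Four-parameter logistic dose-response curve (IC50 on the log10 scale)."""
    return bottom + (top - bottom) / (1.0 + 10 ** (hill * (np.log10(conc) - log_ic50)))

rng = np.random.default_rng(0)
conc = np.logspace(-3, 2, 8)                 # assumed 8-point dilution series (uM)
truth = four_pl(conc, top=1.0, bottom=0.1, log_ic50=-0.5, hill=1.2)
viability = truth + rng.normal(0, 0.03, conc.size)   # synthetic screening readout

params, _ = curve_fit(four_pl, conc, viability,
                      p0=[1.0, 0.0, 0.0, 1.0], maxfev=5000)
print(dict(zip(["top", "bottom", "log_ic50", "hill"], np.round(params, 2))))
```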
Kevlar: Transitioning Helix from Research to Practice
2015-04-01
protective transformations are applied to application binaries before they are deployed. Salient features of Kevlar include applying high-entropy ...variety of classes. Kevlar uses novel, fine-grained, high-entropy diversification transformations to prevent an attacker from successfully exploiting...Kevlar include applying high-entropy randomization techniques, automated program repairs, leveraging highly-optimized virtual machine technology, and in
13 CFR 107.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.
Code of Federal Regulations, 2012 CFR
2012-01-01
... (Leverage) Funding Leverage by Use of Sba-Guaranteed Trust Certificates ("TCs") § 107.1610 Effect of... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 107.1610 Section 107.1610 Business Credit and Assistance SMALL...
13 CFR 107.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (Leverage) Funding Leverage by Use of Sba-Guaranteed Trust Certificates ("TCs") § 107.1610 Effect of... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 107.1610 Section 107.1610 Business Credit and Assistance SMALL...
13 CFR 108.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Companies (Leverage) Funding Leverage by Use of Sba Guaranteed Trust Certificates ("TCs") § 108.1610 Effect... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 108.1610 Section 108.1610 Business Credit and Assistance SMALL...
13 CFR 108.1610 - Effect of prepayment or early redemption of Leverage on a Trust Certificate.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Companies (Leverage) Funding Leverage by Use of Sba Guaranteed Trust Certificates ("TCs") § 108.1610 Effect... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Effect of prepayment or early redemption of Leverage on a Trust Certificate. 108.1610 Section 108.1610 Business Credit and Assistance SMALL...
Buying on margin, selling short in an agent-based market model
NASA Astrophysics Data System (ADS)
Zhang, Ting; Li, Honggang
2013-09-01
Credit trading, or leverage trading, which includes buying on margin and selling short, plays an important role in financial markets, where agents tend to increase their leverage for increased profits. This paper presents an agent-based asset market model to study the effect of the permissive leverage level on traders' wealth and overall market indicators. In this model, heterogeneous agents can assume fundamental value-converging expectations or trend-persistence expectations, and their effective demand for assets depends both on demand willingness and wealth constraints, where leverage can relieve the wealth constraints to some extent. The asset market price is determined by a market maker, who watches the market excess demand, and is influenced by noise factors. Through simulations, we examine market results for different leverage ratios. At the individual level, we focus on how the leverage ratio influences agents' wealth accumulation. At the market level, we focus on how the leverage ratio influences changes in the asset price, volatility, and trading volume. Qualitatively, our model provides some meaningful results supported by empirical facts. More importantly, we find a continuous phase transition as we increase the leverage threshold, which may provide a further perspective on credit trading.
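A stripped-down sketch of the mechanism described above (parameter values are illustrative, not the paper's calibration): fundamentalists and chartists form demands, a leverage ratio relaxes the wealth cap on those demands, and a market maker moves the price with excess demand plus noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, steps, leverage = 100, 500, 2.0        # leverage ratio is the knob of interest
fundamental, price = 100.0, 100.0
is_fundamentalist = rng.random(n_agents) < 0.5
wealth = np.full(n_agents, 1000.0)
prices = []

for t in range(steps):
    # Expected return: fundamentalists expect reversion, chartists extrapolate the trend.
    trend = (prices[-1] - prices[-2]) / prices[-2] if t > 1 else 0.0
    expected = np.where(is_fundamentalist,
                        0.05 * (fundamental - price) / price,
                        2.0 * trend)
    desired = expected * wealth / price                    # demand willingness (shares)
    cap = leverage * wealth / price                        # wealth constraint relaxed by leverage
    demand = np.clip(desired, -cap, cap)
    excess = demand.sum()
    new_price = price * (1 + 0.0005 * excess + rng.normal(0, 0.001))  # market maker + noise
    wealth += demand * (new_price - price)                 # mark positions to market
    price = max(new_price, 1e-6)
    prices.append(price)

print(f"final price {price:.2f}, volatility {np.std(np.diff(np.log(prices))):.4f}")
```

Raising the leverage constant relaxes the wealth constraint, which is the experiment the abstract describes at both the individual and market level.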
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunhart-Lupo, Nicholas
2016-12-06
LibIsopach is a toolkit for high performance distributed immersive visualization, leveraging modern OpenGL. It features a multi-process scenegraph, explicit instance rendering, mesh generation, and three-dimensional user interaction event processing.
High Five: Building Capacity for School Excellence
ERIC Educational Resources Information Center
McCullen, Caroline
2006-01-01
In 2004, five North Carolina school districts combined forces with five corporate foundations to leverage their collective wisdom and develop regional strategies for school improvement. The result was the High Five Regional Partnership for High School Excellence, a corporate-public sector effort that had the common goal of improving graduation…
NASA Technical Reports Server (NTRS)
Smith, Andrew M.; Davis, R. Benjamin; LaVerde, Bruce T.; Fulcher, Clay W.; Jones, Douglas C.; Waldon, James M.; Craigmyle, Benjamin B.
2012-01-01
This validation study examines the effect on vibroacoustic response resulting from the installation of cable bundles on a curved orthogrid panel. Of interest is the level of damping provided by the installation of the cable bundles and whether this damping could be potentially leveraged in launch vehicle design. The results of this test are compared with baseline acoustic response tests without cables. Damping estimates from the measured response data are made using a new software tool that leverages a finite element model of the panel in conjunction with advanced optimization techniques. While the full test series is not yet complete, the first configuration of cable bundles that was assessed effectively increased the viscous critical damping fraction of the system by as much as 0.02 in certain frequency ranges.
The role of socio-technical principles in leveraging meaningful benefits from IT investments.
Doherty, Neil F
2014-03-01
In recent years there has been a great deal of academic and practitioner interest in the role of 'benefits realisation management' [BRM] approaches, as a means of proactively leveraging value from IT investments. This growing body of work owes a very considerable, but as yet unacknowledged, debt to the work of Ken Eason, and other early socio-technical theorists. Consequently, the aim of this paper is to demonstrate, using the literature, how many of the principles, practices and techniques of BRM have evolved either directly or indirectly from socio-technical approaches to systems design. In so doing, this article makes a further important contribution to the literature by explicitly identifying the underlying principles and key practices of benefits realisation management. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Buy, Build, or Steal: China’s Quest for Advanced Military Aviation Technologies
2011-12-01
its systems, resulting in long-term dependence on the seller in order to keep the aircraft flying or to update an older aircraft’s systems. This can...composites gained during the design of its indigenous Tejas Light Combat Aircraft (LCA). Russia has designed mostly metal aircraft and thus lacks...aircraft as an asset in the Cold War against the West. As a result, the Soviet Union did not fully employ its potential leverage and provided the PLA Air
Application of Additively Manufactured Components in Rocket Engine Turbopumps
NASA Technical Reports Server (NTRS)
Calvert, Marty, Jr.; Hanks, Andrew; Schmauch, Preston; Delessio, Steve
2015-01-01
The use of additive manufacturing technology has the potential to revolutionize the development of turbopump components in liquid rocket engines. When designing turbomachinery with the additive process, there are several benefits and risks that are leveraged relative to a traditional development cycle. This paper explores the details and development of a 90,000 RPM liquid hydrogen turbopump for which 90% of the parts were produced using the additive process. This turbopump was designed and developed, and will be tested later this year at Marshall Space Flight Center.
Kris Gutiérrez: designing with and for diversity in the learning sciences
NASA Astrophysics Data System (ADS)
Jurow, A. Susan
2016-03-01
This article reviews the significance of the theoretical and practical contributions of Kris Gutiérrez to research on science education. Gutiérrez's ideas about design and equity have inspired scholars to investigate how to leverage learners' everyday practices to make meaningful connections to disciplinary-based knowledge and skills. Her work has provided valuable direction on how to engage the challenges of organizing for more equitable futures through critical understanding of cultural diversity as a resource for transformative learning.
Motivation for DOC III: 64-bit digital optical computer
NASA Astrophysics Data System (ADS)
Guilfoyle, Peter S.
1991-09-01
This paper suggests a new class of digital logic. OptiComp has focused on a digital optical logic family in order to capitalize on the inherent benefits of optical computing, which include (1) high FAN-IN and FAN-OUT, (2) low power consumption, (3) high noise margin, (4) high algorithmic efficiency using 'smart' interconnects, (5) free space leverage of GIBP (gate interconnect bandwidth product). Other well-known secondary advantages of optical logic include (but are not limited to) zero capacitive loading of signals at a detector, zero cross-talk between signals, zero signal dispersion, minimal clock skew (a few picoseconds or less in an imaging system). The primary focus of this paper is to demonstrate how each of the five advantages can be used to leverage other logic family performance such as GaAs; the secondary attributes will be discussed only in the context of introducing the DOC III architecture.
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
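As a rough illustration of the job-submission piece of such a wrapper, the sketch below assumes a SLURM-based system (as on Edison and Cori); the batch directives, queue name, and executable are invented placeholders, not the ALICE tool's actual interface.

```python
import subprocess
import tempfile

# Minimal sketch of a batch-submission helper for a SLURM system.
# The directives, qos name, and executable below are invented placeholders.
def submit_job(name, command, nodes=1, walltime="02:00:00", qos="regular"):
    script = f"""#!/bin/bash
#SBATCH --job-name={name}
#SBATCH --nodes={nodes}
#SBATCH --time={walltime}
#SBATCH --qos={qos}
srun {command}
"""
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script)
        path = f.name
    out = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
    return out.stdout.strip()          # e.g. "Submitted batch job 123456"

# job_id = submit_job("alice-sim", "./run_simulation.sh", nodes=4)
```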
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm
Chen, Jui-Le; Yang, Chu-Sing
2013-01-01
The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although computer science technologies can be used to reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction remains unsatisfactory. Hence, in this paper, a novel docking prediction algorithm, named fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate docking prediction. The proposed algorithm works by leveraging two high-performance operators: (1) the novel migration (information exchange) operator is designed specially for cloud-based environments to reduce the computation time; (2) the efficient operator is aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both the computation time and the quality of the end result. PMID:23762864
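The two operators can be pictured with a toy island-model search: candidates migrate between islands (information exchange) and the worst search directions are filtered out each generation. The sketch below is illustrative only; the scoring function and all parameters are invented stand-ins, not the FCPLDPA docking score.

```python
import random

# Toy island-model search illustrating the two operators named in the abstract:
# migration between islands and filtering of the worst search directions.
def fitness(x):
    return -sum((xi - 0.3) ** 2 for xi in x)   # stand-in scoring function (higher is better)

def evolve(pop):
    # keep only the better half of the candidates ("filter out worst directions")
    pop.sort(key=fitness, reverse=True)
    survivors = pop[: len(pop) // 2]
    children = [[xi + random.gauss(0, 0.05) for xi in random.choice(survivors)]
                for _ in range(len(pop) - len(survivors))]
    return survivors + children

def migrate(islands, k=2):
    # exchange the k best candidates of each island with its neighbour (ring topology)
    bests = [sorted(isl, key=fitness, reverse=True)[:k] for isl in islands]
    for i, isl in enumerate(islands):
        isl[-k:] = bests[(i + 1) % len(islands)]

islands = [[[random.random() for _ in range(6)] for _ in range(20)] for _ in range(4)]
for gen in range(50):
    islands = [evolve(isl) for isl in islands]
    if gen % 10 == 0:
        migrate(islands)

print(max(fitness(x) for isl in islands for x in isl))
```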
A Converter from the Systems Biology Markup Language to the Synthetic Biology Open Language.
Nguyen, Tramy; Roehner, Nicholas; Zundel, Zach; Myers, Chris J
2016-06-17
Standards are important to synthetic biology because they enable exchange and reproducibility of genetic designs. This paper describes a procedure for converting between two standards: the Systems Biology Markup Language (SBML) and the Synthetic Biology Open Language (SBOL). SBML is a standard for behavioral models of biological systems at the molecular level. SBOL describes structural and basic qualitative behavioral aspects of a biological design. Converting SBML to SBOL enables a consistent connection between behavioral and structural information for a biological design. The conversion process described in this paper leverages Systems Biology Ontology (SBO) annotations to enable inference of a design's qualitative function.
Trifiletti, Daniel M.; Showalter, Timothy N.
2015-01-01
Several advances in large data set collection and processing have the potential to provide a wave of new insights and improvements in the use of radiation therapy for cancer treatment. The era of electronic health records, genomics, and improving information technology resources creates the opportunity to leverage these developments to create a learning healthcare system that can rapidly deliver informative clinical evidence. By merging concepts from comparative effectiveness research with the tools and analytic approaches of “big data,” it is hoped that this union will accelerate discovery, improve evidence for decision making, and increase the availability of highly relevant, personalized information. This combination offers the potential to provide data and analysis that can be leveraged for ultra-personalized medicine and high-quality, cutting-edge radiation therapy. PMID:26697409
Johns, Margaret A; Meyerkord-Belton, Cheryl L; Du, Yuhong; Fu, Haian
2014-03-01
The Emory Chemical Biology Discovery Center (ECBDC) aims to accelerate high throughput biology and translation of biomedical research discoveries into therapeutic targets and future medicines by providing high throughput research platforms to scientific collaborators worldwide. ECBDC research is focused at the interface of chemistry and biology, seeking to fundamentally advance understanding of disease-related biology with its HTS/HCS platforms and chemical tools, ultimately supporting drug discovery. Established HTS/HCS capabilities, university setting, and expertise in diverse assay formats, including protein-protein interaction interrogation, have enabled the ECBDC to contribute to national chemical biology efforts, empower translational research, and serve as a training ground for young scientists. With these resources, the ECBDC is poised to leverage academic innovation to advance biology and therapeutic discovery.
Leverage hadoop framework for large scale clinical informatics applications.
Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise
2013-01-01
In this manuscript, we present our experiences using the Apache Hadoop framework for high data volume and computationally intensive applications, and discuss some best practice guidelines in a clinical informatics setting. There are three main aspects in our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column oriented features in HBase for patient centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic advantage of fault tolerance, high availability and scalability of Hadoop platform makes these applications readily deployable at the enterprise level cluster environment.
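Step (a) of this approach is commonly realized with Hadoop Streaming, where the mapper and reducer are plain scripts. The pair below is a minimal, hypothetical example in that style; the tab-separated field layout and the "LAB" event code are assumptions, not the authors' schema.

```python
#!/usr/bin/env python3
# ---- mapper.py : emit (patient_id, value) pairs for lab-result events ----
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 3 and fields[1] == "LAB":      # keep only lab-result events
        patient_id, _, value = fields[:3]
        print(f"{patient_id}\t{value}")

# ---- reducer.py : average values per patient (Hadoop delivers keys sorted) ----
# import sys
# from itertools import groupby
#
# def keyed(stdin):
#     for line in stdin:
#         key, value = line.rstrip("\n").split("\t", 1)
#         yield key, float(value)
#
# for patient_id, group in groupby(keyed(sys.stdin), key=lambda kv: kv[0]):
#     values = [v for _, v in group]
#     print(f"{patient_id}\t{sum(values) / len(values):.3f}")
```

Such scripts are typically launched through the hadoop-streaming jar with -mapper, -reducer, -input, and -output arguments; the Mahout analysis and HBase patient-centric modeling mentioned above sit downstream of this aggregation step.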
Build Engagement and Knowledge One Block at a Time with Minecraft
ERIC Educational Resources Information Center
Tromba, Peter
2013-01-01
The core of instruction is the interaction between the student, the content, and the teacher. Good instructional design accounts for the students' needs and interests by personalizing the core to each student. Video games and simulations are one way to meet student needs and leverage their interests for increased student learning. In the 2011-12…
ERIC Educational Resources Information Center
Ertle, Barbrina; Rosenfeld, Deborah; Presser, Ashley Lewis; Goldstein, Marion
2016-01-01
This paper presents a rationale for and description of the professional development system designed to help teachers understand and use the Birthday Party (BP) Mathematics Assessment, a standardized assessment with child-friendly birthday party themed tasks, and ultimately to leverage their learning from the BP to conduct their own meaningful…
Aguilar, Mario; Peot, Mark A; Zhou, Jiangying; Simons, Stephen; Liao, Yuwei; Metwalli, Nader; Anderson, Mark B
2012-03-01
The mammalian visual system is still the gold standard for recognition accuracy, flexibility, efficiency, and speed. Ongoing advances in our understanding of function and mechanisms in the visual system can now be leveraged to pursue the design of computer vision architectures that will revolutionize the state of the art in computer vision.
A Mobile Health Intervention to Sustain Recent Weight Loss
ERIC Educational Resources Information Center
Shaw, Ryan Jeffrey
2012-01-01
The goal of this study was to design an intervention that would help people stay in the continued response phase of the Behavior Change Process and help prevent weight relapse. Using the Behavior Change Process and regulatory focus theory, an intervention was developed that leveraged short message service (SMS) to deliver messages to people who…
ERIC Educational Resources Information Center
Martins, Jorge Tiago
2016-01-01
Purpose: Focusing on the specific context of two European old industrial regions--South Yorkshire (UK) and North Region of Portugal--this paper aims to identify and conceptualise a set of relational capabilities that business leaders perceive to play a key role in industrial rejuvenation. Design/Methodology/Approach: A qualitative research design…
Modeling fuel treatment leverage: Encounter rates, risk reduction, and suppression cost impacts
Matthew P. Thompson; Karin L. Riley; Dan Loeffler; Jessica R. Haas
2017-01-01
The primary theme of this study is the cost-effectiveness of fuel treatments at multiple scales of investment. We focused on the nexus of fuel management and suppression response planning, designing spatial fuel treatment strategies to incorporate landscape features that provide control opportunities that are relevant to fire operations. Our analysis explored the...
ERIC Educational Resources Information Center
Riley, Jason M.; Ellegood, William A.; Solomon, Stanislaus; Baker, Jerrine
2017-01-01
Purpose: This study aims to understand how mode of delivery, online versus face-to-face, affects comprehension when teaching operations management concepts via a simulation. Conceptually, the aim is to identify factors that influence the students' ability to learn and retain new concepts. Design/methodology/approach: Leveraging Littlefield…
ERIC Educational Resources Information Center
Chang, Yi-Hsing; Chen, Yen-Yi; Chen, Nian-Shing; Lu, You-Te; Fang, Rong-Jyue
2016-01-01
This study designs and implements an adaptive learning management system based on Felder and Silverman's Learning Style Model and the Mashup technology. In this system, Felder and Silverman's Learning Style model is used to assess students' learning styles, in order to provide adaptive learning to leverage learners' learning preferences.…
Getting Started with The Math Forum Problems of the Week Library. Teacher's Guide
ERIC Educational Resources Information Center
Math Forum @ Drexel, 2009
2009-01-01
The Math Forum Problems of the Week Library is designed to leverage the power of interactive technology to hold student interest while increasing their success as strategic thinkers. The Math Forum Library is an online source of non-routine challenges in which problem solving and mathematical communication are key elements of every problem. This…
ERIC Educational Resources Information Center
Stokes, Leah C.; Mildenberger, Matto; Savan, Beth; Kolenda, Brian
2012-01-01
Conducting a barriers analysis is an important first step when designing proenvironmental behavior change interventions. Yet, detailed information on common barriers to energy conservation campaigns remains unavailable. Using a pair of original surveys, we leverage the theory of planned behavior to report on the most important barriers for…
ERIC Educational Resources Information Center
Cole, Mikel; Puzio, Kelly; Keyes, Christopher; Jimenez, Robert; Pray, Lisa; David, Samuel
2012-01-01
This article presents findings drawn from the development of an intervention designed to leverage Spanish to improve English reading comprehension. Five teachers and 18 middle school English language learners (ELLs) in 2 urban middle schools participated in the project over the course of an academic year. Analysis of policy documents, interviews,…
ERIC Educational Resources Information Center
O'Brien, Kelsey; Forte, Michele; Mackey, Thomas P.; Jacobson, Trudi E.
2017-01-01
This article examines metaliteracy as a pedagogical model that leverages the assets of MOOC platforms to enhance self-regulated and self-empowered learning. Between 2013 and 2015, a collaborative teaching team within the State University of New York (SUNY) developed three MOOCs on three different platforms--connectivist, Coursera and Canvas--to…
Race to the Top Annual Performance Report. CFDA Number: 84.395
ERIC Educational Resources Information Center
US Department of Education, 2012
2012-01-01
The Department has developed a Race to the Top program review process that not only addresses the Department's responsibilities for fiscal and programmatic oversight, but is designed to identify areas to differentiate support based on individual State needs, as well as certain topics where States can leverage work with each other and with experts…
A Planning Framework for Crafting the Required-Curriculum Phase of an MBA Program
ERIC Educational Resources Information Center
Haskins, Mark E.
2005-01-01
This article introduces a planning framework for designing that part of an MBA program during which students take the bulk, if not all, of their required courses. The framework highlights three student venues that can be jointly leveraged for enhanced student learning. Those venues are the required curriculum, students' affinity groups, and the…
ERIC Educational Resources Information Center
Nickelson, Jen; Alfonso, Moya L.; McDermott, Robert J.; Bumpus, Elizabeth C.; Bryant, Carol A.; Baldwin, Julie A.
2011-01-01
Creating community-based opportunities for youth to be physically active is challenging for many municipalities. A Lexington, Kentucky community coalition designed and piloted a physical activity program, "VERB[TM] summer scorecard (VSS)", leveraging the brand equity of the national VERB[TM]--It's What You Do! campaign. Key elements of…
FRAN: financial ratio analysis and more (Version 2.0 for Windows)
Bruce G. Hansen; Arnold J. Palmer, Jr.
1999-01-01
FRAN is a computer-based, stand-alone program designed to generate important financial and operating ratios from tax and wage forms filed with the Internal Revenue Service. FRAN generates standard profitability, financial/leverage, liquidity/solvency, and activity ratios, as well as unique measures of workforce and capital cost and acquisition. Information produced by...
1992-05-01
methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering...and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be...evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key
ERIC Educational Resources Information Center
Henry, Wesley L. C.
2017-01-01
Rural America is rapidly becoming more diverse, yet rural communities remain different from their urban and suburban counterparts. Despite several decades of economic hardship in rural areas, rural schools are under researched by scholars and under prioritized by policymakers. Therefore, this study was designed to better understand how school and…
Keep It R.E.A.L.!: Relevant, Engaging, and Affirming Literacy for Adolescent English Learners
ERIC Educational Resources Information Center
Stewart, Mary Amanda
2017-01-01
This book introduces a set of pedagogical practices designed to assist adolescent English learners in developing their English skills in a way that honors and leverages their native languages and cultures. Responding to the linguistic and educational diversity of adolescents, the R.E.A.L. (Relevant, Engaging, and Affirming Literacy) method offers…
ERIC Educational Resources Information Center
Fogleman, Jay; Niedbala, Mona Anne; Bedell, Francesca
2013-01-01
How do educators leverage students' fluency with ubiquitous information and communication sources to foster a scholarly digital ethos? This article describes a blended learning environment designed to engage first-year students in 21st-century emerging forms of scholarship and publication. The authors describe an effort to reverse the millennials'…
Protein leverage affects energy intake of high-protein diets in humans.
Martens, Eveline A; Lemmens, Sofie G; Westerterp-Plantenga, Margriet S
2013-01-01
The protein leverage hypothesis requires specific evidence that protein intake is regulated more strongly than energy intake. The objective was to determine ad libitum energy intake, body weight changes, and appetite profile in response to protein-to-carbohydrate + fat ratio over 12 consecutive days and in relation to age, sex, BMI, and type of protein. A 12-d randomized crossover study was performed in 40 men and 39 women [mean ± SD age: 34.0 ± 17.6 y; BMI (in kg/m(2)): 23.7 ± 3.4] with the use of diets containing 5%, 15%, and 30% of energy from protein from a milk or plant source. Protein-content effects did not differ by age, sex, BMI, or type of protein. Total energy intake was significantly lower in the high-protein (7.21 ± 3.08 MJ/d) condition than in the low-protein (9.33 ± 3.52 MJ/d) and normal-protein (9.62 ± 3.51 MJ/d) conditions (P = 0.001), which was predominantly the result of a lower energy intake from meals (P = 0.001). Protein intake varied directly according to the amount of protein in the diet (P = 0.001). The AUC of visual analog scale appetite ratings did not differ significantly, yet fluctuations in hunger (P = 0.019) and desire to eat (P = 0.026) over the day were attenuated in the high-protein condition compared with the normal-protein condition. We found evidence to support the protein leverage hypothesis in that individuals underate relative to energy balance from diets containing a higher protein-to-carbohydrate + fat ratio. No evidence for protein leverage effects from diets containing a lower ratio of protein to carbohydrate + fat was obtained. It remains to be shown whether a relatively low protein intake would cause overeating or would be the effect of overeating of carbohydrate and fat. The study was registered at clinicaltrials.gov as NCT01320189.
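As a back-of-the-envelope reading of the reported means (a derived figure, not one stated in the abstract):
\[
\frac{9.62 - 7.21}{9.62} \approx 0.25,
\]
i.e., ad libitum energy intake on the 30%-protein diet was roughly 25% lower than on the normal-protein diet, while absolute protein intake still rose with the dietary protein fraction, which is the direction the protein leverage hypothesis predicts.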
Parental Involvement Protects against Self-Medication Behaviors during the High School Transition
Gottfredson, Nisha C.; Hussong, Andrea M.
2011-01-01
We examined how drinking patterns change as adolescents transition to high school, particularly as a function of parental involvement. Stress associated with the transition to high school may deplete psychological resources for coping with negative daily emotions in an environment when opportunities to drink are more common. A cohort of elevated-risk middle school students completed daily negative affect (sadness, worry, anger, and stress) and alcohol use assessments before and after the transition to high school, resulting in a measurement burst design. Adolescents who reported less parental involvement were at higher risk for drinking on any given day. After (but not before) the transition to high school, daily within-person fluctuations of sadness predicted an increased probability of same-day alcohol use for adolescents who reported that their parents were minimally involved in their lives. The other negative affect indicators were not predictive of use. Our results suggest that the transition to high school may represent an important intervention leverage point, particularly for adolescents who lack adequate parental support to help them cope with day-to-day changes in sadness. PMID:21880433
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
... in inverse ETFs, the Fund will not invest in leveraged or inverse leveraged (e.g., 2X or -3X) ETFs..., according to the Exchange, the Fund's ownership and control of the First Trust Subsidiary will prevent the.... The Fund may invest in inverse ETFs, but it will not invest in leveraged or inverse leveraged ETFs...
NASA Astrophysics Data System (ADS)
Suwondo; Darmadi; Yunus, M.
2018-01-01
The development process has resulted in deforestation. A comprehensive study is needed to obtain an objective solution by integrating the ecological and human dimensions. This study was conducted within the Balai Raja Wildlife Reserve (BRWR), Bengkalis Regency, Riau Province, Indonesia. We used a social-ecological systems (SES) approach based on local characteristics, categorized into ecological status, social status, and actors. Each factor is ranked using Multi-Dimensional Scaling (MDS). BRWR sustainability levels are in moderate condition. The ecological dimension is in a less sustainable state, with leverage factors: (1) forest conversion; (2) local ecological knowledge; (3) high conservation value. The social dimension is in a less sustainable state, with leverage factors: (1) community empowerment; (2) social conflict; (3) participation in landscape management. The actors dimension is at a fairly sustainable status, with leverage factors: (1) institutional interaction; (2) stakeholder commitment; (3) law enforcement. We recommend strengthening community empowerment, local ecological knowledge, interaction, and stakeholder commitment.
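The MDS step works in the spirit of RAPFISH-style ordinations: each dimension's attribute scores are placed between "good" and "bad" reference profiles and read off as a sustainability index. The sketch below is purely illustrative; the attribute names and scores are invented placeholders, not the study's data.

```python
import numpy as np
from sklearn.manifold import MDS

# Illustrative MDS ordination of attribute scores for one dimension (ecology).
attributes = ["forest_conversion", "local_ecological_knowledge",
              "high_conservation_value", "habitat_quality", "species_richness"]
# rows: "good" reference profile, "bad" reference profile, observed profile
scores = np.array([
    [0, 0, 0, 0, 0],      # good reference
    [3, 3, 3, 3, 3],      # bad reference
    [2, 1, 1, 2, 2],      # observed (invented)
], dtype=float)

mds = MDS(n_components=2, dissimilarity="euclidean", random_state=0)
coords = mds.fit_transform(scores)

# Project the observed point onto the good-bad axis to get a 0-100 index.
good, bad, obs = coords
axis = bad - good
index = 100 * (1 - np.dot(obs - good, axis) / np.dot(axis, axis))
print(f"ecology sustainability index ~ {index:.1f} / 100")
```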
Observatory software for the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Vermeulen, Tom; Isani, Sidik; Withington, Kanoa; Ho, Kevin; Szeto, Kei; Murowinski, Rick
2016-07-01
The Canada-France-Hawaii Telescope is currently in the conceptual design phase to redevelop its facility into the new Maunakea Spectroscopic Explorer (MSE). MSE is designed to be the largest non-ELT optical/NIR astronomical telescope and will be a fully dedicated facility for multi-object spectroscopy over a broad range of spectral resolutions. This paper outlines the software and control architecture envisioned for the new facility. The architecture will be designed around much of the existing software infrastructure currently used at CFHT, as well as the latest proven open-source software. CFHT plans to minimize risk and development time by leveraging existing technology.
Microstructure design for fast oxygen conduction
Aidhy, Dilpuneet S.; Weber, William J.
2015-11-11
Research from the last decade has shown that the design of fast oxygen-conducting materials for electrochemical applications has largely shifted toward microstructural features rather than bulk material properties. In particular, understanding oxygen energetics in heterointerface materials is currently at the forefront, where interfacial tensile strain is being considered as the key parameter in lowering oxygen migration barriers. Nanocrystalline materials with high densities of grain boundaries have also gathered interest, as they could allow leverage over excess volume at grain boundaries, providing fast oxygen diffusion channels similar to those previously observed in metals. In addition, near-interface phase transformations and misfit dislocations are other microstructural features being explored to provide faster diffusion. In this review, the current understanding of oxygen energetics, i.e., thermodynamics and kinetics, originating from these microstructural features is discussed. Moreover, our experimental observations, theoretical predictions, and novel atomistic mechanisms relevant to oxygen transport are highlighted. In addition, the interaction of dopants with oxygen vacancies in the presence of these new microstructural features, and their role in the design of future fast-ion conductors, is outlined.
Citrate chemistry and biology for biomaterials design.
Ma, Chuying; Gerhard, Ethan; Lu, Di; Yang, Jian
2018-05-04
Leveraging the multifunctional nature of citrate in chemistry and inspired by its important role in biological tissues, a class of highly versatile and functional citrate-based biomaterials (CBBs) has been developed via facile and cost-effective polycondensation. CBBs exhibiting tunable mechanical properties and degradation rates, together with excellent biocompatibility and processability, have been successfully applied in vitro and in vivo for applications ranging from soft to hard tissue regeneration, as well as for nanomedicine designs. In this review, we summarize chemistry considerations for CBB design to tune polymer properties and introduce functionality, with a focus on the most recent advances; the biological functions of citrate in native tissues, highlighting the new notion of degradation products as cell modulators; and the applications of CBBs in wound healing, nanomedicine, and orthopedic, cardiovascular, nerve and bladder tissue engineering. Given the expansive evidence for citrate's potential in biology and biomaterials science outlined in this review, it is expected that citrate-based materials will continue to play an important role in regenerative engineering. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lang, Hans-Dieter; Sarris, Costas D.
2017-09-01
In magnetically mediated hyperthermia (MMH), an externally applied alternating magnetic field interacts with a mediator (such as a magnetic nanoparticle or an implant) inside the body to heat up the tissue in its proximity. Producing heat via induced currents in this manner is strikingly similar to wireless power transfer (WPT) for implants, where power is transferred from a transmitter outside of the body to an implanted receiver, in most cases via magnetic fields as well. Leveraging this analogy, a systematic method to design MMH implants for optimal heating efficiency is introduced, akin to the design of WPT systems for optimal power transfer efficiency. This paper provides analytical formulas for the achievable heating efficiency bounds as well as the optimal operating frequency and the implant material. Multiphysics simulations validate the approach and further demonstrate that optimization with respect to maximum heating efficiency is accompanied by minimizing heat delivery to healthy tissue. This is a property that is highly desirable when considering MMH as a key component or complementary method of cancer treatment and other applications.
Emulating multiple inheritance in Fortran 2003/2008
Morris, Karla
2015-01-24
Although the high-performance computing (HPC) community increasingly embraces object-oriented programming (OOP), most HPC OOP projects employ the C++ programming language. Until recently, Fortran programmers interested in mining the benefits of OOP had to emulate OOP in Fortran 90/95. The advent of widespread compiler support for Fortran 2003 now facilitates explicitly constructing object-oriented class hierarchies via inheritance and leveraging related class behaviors such as dynamic polymorphism. Although C++ allows a class to inherit from multiple parent classes, Fortran and several other OOP languages restrict or prohibit explicit multiple inheritance relationships in order to circumvent several pitfalls associated with them. Nonetheless, what appears as an intrinsic feature in one language can be modeled as a user-constructed design pattern in another language. The present paper demonstrates how to apply the facade structural design pattern to support a multiple inheritance class relationship in Fortran 2003. As a result, the design unleashes the power of the associated class relationships for modeling complicated data structures yet avoids the ambiguities that plague some multiple inheritance scenarios.
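The facade idea can be illustrated schematically (here in Python, since this listing contains no Fortran elsewhere; Python has native multiple inheritance, so this is purely for illustration): the "child" type holds the would-be parents as components and forwards calls to them, exposing the union of their interfaces. All names below are invented.

```python
# Illustrative delegation-based facade: a class that behaves as if it inherited
# from two parents by holding them as components and forwarding calls.
class Writer:
    def write(self, text):
        print(f"writing: {text}")

class Timer:
    def __init__(self):
        self.ticks = 0
    def tick(self):
        self.ticks += 1

class TimedWriter:
    """Facade exposing the union of the Writer and Timer interfaces via delegation."""
    def __init__(self):
        self._writer = Writer()
        self._timer = Timer()
    def write(self, text):
        self._timer.tick()                 # forward to the Timer component
        self._writer.write(text)           # forward to the Writer component
    @property
    def ticks(self):
        return self._timer.ticks

tw = TimedWriter()
tw.write("hello")
print(tw.ticks)   # -> 1
```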
Performance evaluation of electro-optic effect based graphene transistors
NASA Astrophysics Data System (ADS)
Gupta, Gaurav; Abdul Jalil, Mansoor Bin; Yu, Bin; Liang, Gengchiau
2012-09-01
Despite the advantages afforded by the unique electronic properties of graphene, the absence of a bandgap has limited its applicability in logic devices. This has led to a study on electro-optic behavior in graphene for novel device operations, beyond the conventional field effect, to meet the requirements of ultra-low power and high-speed logic transistors. Recently, two potential designs have been proposed to leverage on this effect and open a virtual bandgap for ballistic transport in the graphene channel. The first one implements a barrier in the centre of the channel, whereas the second incorporates a tilted gate junction. In this paper, we computationally evaluate the relative device performance of these two designs, in terms of subthreshold slope (SS) and ION/IOFF ratio under different temperature and voltage bias, for a defect-free graphene channel. Our calculations employ pure optical modeling for low field electron transport under the constraints of device anatomy. The calculated results show that the two designs are functionally similar and are able to provide SS smaller than 60 mV per decade. Both designs show similar device performance but marginally top one another under different operating constraints. Our results could serve as a guide to circuit designers in selecting an appropriate design as per their system specifications and requirements.
NASA Astrophysics Data System (ADS)
Pan, Zhiyuan; Liu, Li
2018-02-01
In this paper, we extend the GARCH-MIDAS model proposed by Engle et al. (2013) to account for the leverage effect in the short-term and long-term volatility components. Our in-sample evidence suggests that both short-term and long-term negative returns can cause higher future volatility than positive returns. Out-of-sample results show that the predictive ability of GARCH-MIDAS is significantly improved after taking the leverage effect into account. The leverage effect for the short-term volatility component plays a more important role than the leverage effect for the long-term volatility component in affecting out-of-sample forecasting performance.
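One common way to write such a leverage term, shown here only as a schematic GJR-style short-run component (the paper's exact specification may differ):
\[
g_{i,t} \;=\; \Bigl(1-\alpha-\beta-\tfrac{\gamma}{2}\Bigr)
\;+\;\bigl(\alpha + \gamma\,\mathbf{1}_{\{r_{i-1,t}<\mu\}}\bigr)\,
\frac{(r_{i-1,t}-\mu)^2}{\tau_t}
\;+\;\beta\, g_{i-1,t},
\]
where \(\tau_t\) is the long-run (MIDAS) component and \(\gamma>0\) makes negative return surprises raise next-period volatility more than positive ones; an analogous asymmetric term can be placed in the long-run filter.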
Leader Development: Leveraging Combat Power Through Leadership,
1993-12-17
performance for all rangers, officer and enlisted. 68. Stephen R. Covey, The 7 Habits of Highly Effective People: Restoring the Character Ethic. (New...Orange, New Jersey; Leadership Library of America, 1990). Covey, Stephen R. The 7 Habits of Highly Effective People: Restoring the Character Ethic. (New
ERIC Educational Resources Information Center
Kiany, Gholam Reza; Shayestefar, Parvaneh; Samar, Reza Ghafar; Akbari, Ramin
2013-01-01
A steady stream of studies on high-stakes tests such as University Entrance Examinations (UEEs) suggests that high-stakes test reforms serve as leverage for promoting quality of learning, standards of teaching, and credible forms of accountability. However, such remediation is often not as effective as hoped and success is not necessarily…
A Study of Particle Beam Spin Dynamics for High Precision Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiedler, Andrew J.
In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and if there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.
NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments
NASA Technical Reports Server (NTRS)
Zernic, M. J.; Beering, D. R.; Brooks, D. E.
2000-01-01
This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the comprehensive communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost effective communications for space operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shanks, Katherine S.; Philipp, Hugh T.; Weiss, Joel T.
Experiments at storage ring light sources as well as at next-generation light sources increasingly require detectors capable of high dynamic range operation, combining low-noise detection of single photons with large pixel well depth. XFEL sources in particular provide pulse intensities sufficiently high that a purely photon-counting approach is impractical. The High Dynamic Range Pixel Array Detector (HDR-PAD) project aims to provide a dynamic range extending from single-photon sensitivity to 10^6 photons/pixel in a single XFEL pulse while maintaining the ability to tolerate a sustained flux of 10^11 ph/s/pixel at a storage ring source. Achieving these goals involves the development of fast pixel front-end electronics as well as, in the XFEL case, leveraging the delayed charge collection due to plasma effects in the sensor. A first prototype of essential electronic components of the HDR-PAD readout ASIC, exploring different options for the pixel front-end, has been fabricated. Here, the HDR-PAD concept and preliminary design will be described.
Design of an ergonomic ultrasound system: accommodation of user anthropometrics.
Park, Sung; Yim, Jinho; Lee, Goeun
2012-01-01
Long-term use of medical imaging devices requires significant improvements to the user experience. One factor that impacts such experience is whether the device is ergonomically built, ecologically designed, and leverages current medical practice. In this research, we took a holistic and systematic approach to designing an effective and biomechanically fit ultrasound system. Research methods from behavioral science (e.g., contextual inquiry, pseudo experiments) were adopted to involve the users (sonographers) early in the design process. The end results - a product design guideline for a cart-type ultrasound system and a control panel layout - were reviewed by the users and adjusted so that the design is within the range of an acceptable learning curve while maintaining innovativeness, a differentiated value over competitors' ultrasound devices.
Karwowski, Waldemar; Ahram, Tareq Z
2012-01-01
In order to leverage individual and organizational learning and to remain competitive in current turbulent markets, it is important for employees, managers, planners and leaders to perform at high levels over time. Employee competence and skills are extremely important matters in view of the general shortage of talent and the mobility of employees with talent. Two factors emerged as having the greatest impact on the competitiveness of complex service systems: improving managerial and employee knowledge attainment for skills, and improving the training and development of the workforce. This paper introduces a knowledge-based, user-centered service design approach for sustainable skill and performance improvement in education, design and modeling of the next generation of complex service systems. The rest of the paper covers topics in human factors and sustainable business process modeling for the service industry, and illustrates the user-centered service system development cycle with the integration of systems engineering concepts in service systems. A roadmap for designing service systems of the future is discussed. The framework introduced in this paper is based on key user-centered design principles and systems engineering applications to support service competitiveness.
Kevlar: Transitioning Helix for Research to Practice
2016-03-01
entropy randomization techniques, automated program repairs leveraging highly-optimized virtual machine technology, and developing a novel framework...attacker from exploiting residual vulnerabilities in a wide variety of classes. Helix/Kevlar uses novel, fine-grained, high-entropy diversification...the Air Force, and IARPA). Salient features of Helix/Kevlar include developing high-entropy randomization techniques, automated program repairs
ERIC Educational Resources Information Center
da Silveira, Pedro Rodrigo Castro
2014-01-01
This thesis describes the development and deployment of a cyberinfrastructure for distributed high-throughput computations of materials properties at high pressures and/or temperatures--the Virtual Laboratory for Earth and Planetary Materials--VLab. VLab was developed to leverage the aggregated computational power of grid systems to solve…
Rewarding safe behavior: strategies for change.
Fell-Carlson, Deborah
2004-12-01
Effective, sustainable safety incentives are integrated into a performance management system designed to encourage long term behavior change. Effective incentive program design integrates the fundamental considerations of compensation (i.e., valence, instrumentality, expectancy, equity) with behavior change theory in the context of a strong merit based performance management system. Clear expectations are established and communicated from the time applicants apply for the position. Feedback and social recognition are leveraged and used as rewards, in addition to financial incentives built into the compensation system and offered periodically as short term incentives. Rewards are tied to specific objectives intended to influence specific behaviors. Objectives are designed to challenge employees, providing opportunities to grow and enhance their sense of belonging. Safety contests and other awareness activities are most effective when used to focus safety improvement efforts on specific behaviors or processes, for a predetermined period of time, in the context of a comprehensive safety system. Safety incentive programs designed around injury outcomes can result in unintended, and undesirable, consequences. Safety performance can be leveraged by integrating safety into corporate cultural indicators. Symbols of safety remind employees of corporate safety goals and objectives (e.g., posted safety goals and integrating safety into corporate mission and vision). Rites and ceremonies provide opportunities for social recognition and feedback and demonstrate safety is a corporate value. Feedback opportunities, rewards, and social recognition all provide content for corporate legends, those stories embellished over time, that punctuate the overall system of organizational norms, and provide examples of the organizational safety culture in action.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-04-01
This report documents implementation strategies to leverage public and private resources for the development of an adequate national security workforce as part of the National Security Preparedness Project (NSPP), being performed under a U.S. Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. There are numerous efforts across the United States to develop a properly skilled and trained national security workforce. Some of these efforts are the result of the leveraging of public and private dollars. As budget dollars decrease and the demand for a properly skilled and trained national security workforce increases, it will become even more important to leverage every education and training dollar. This report details some of the efforts that have been implemented to leverage public and private resources, as well as implementation strategies to further leverage public and private resources.
On the Suitability of MPI as a PGAS Runtime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.; Vishnu, Abhinav; Palmer, Bruce J.
2014-12-18
Partitioned Global Address Space (PGAS) models are emerging as a popular alternative to MPI models for designing scalable applications. At the same time, MPI remains a ubiquitous communication subsystem due to its standardization, high performance, and availability on leading platforms. In this paper, we explore the suitability of using MPI as a scalable PGAS communication subsystem. We focus on the Remote Memory Access (RMA) communication in PGAS models, which typically includes get, put, and atomic memory operations. We perform an in-depth exploration of design alternatives based on MPI. These alternatives include using a semantically-matching interface such as MPI-RMA, as well as not-so-intuitive interfaces such as MPI two-sided with a combination of multi-threading and dynamic process management. With an in-depth exploration of these alternatives and their shortcomings, we propose a novel design which is facilitated by the data-centric view in PGAS models. This design leverages a combination of highly tuned MPI two-sided semantics and an automatic, user-transparent split of MPI communicators to provide asynchronous progress. We implement the asynchronous progress ranks approach and other approaches within the Communication Runtime for Exascale, which is a communication subsystem for Global Arrays. Our performance evaluation spans pure communication benchmarks, graph community detection and sparse matrix-vector multiplication kernels, and a computational chemistry application. The utility of our proposed PR-based approach is demonstrated by a 2.17x speed-up on 1008 processors over the other MPI-based designs.
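The communicator-split idea can be sketched with mpi4py; the grouping rule (one progress rank per eight ranks) and the toy request/reply protocol below are invented for illustration and are not the CRE implementation.

```python
from mpi4py import MPI

# Illustrative split of MPI_COMM_WORLD into "worker" ranks and dedicated
# "progress" ranks, loosely mirroring the asynchronous-progress-ranks idea.
world = MPI.COMM_WORLD
rank, size = world.Get_rank(), world.Get_size()

is_progress = (rank % 8 == 0)                  # e.g. one progress rank per group of 8
sub = world.Split(color=1 if is_progress else 0, key=rank)   # workers compute on `sub`

if is_progress:
    # Progress ranks service remote requests from the workers in their group.
    n_clients = len(range(rank + 1, min(rank + 8, size)))
    for _ in range(n_clients):
        status = MPI.Status()
        data = world.recv(source=MPI.ANY_SOURCE, tag=1, status=status)
        world.send(data * 2, dest=status.Get_source(), tag=2)   # toy "remote get" reply
else:
    # Worker ranks issue a request to their group's progress rank.
    target = (rank // 8) * 8
    world.send(rank, dest=target, tag=1)
    result = world.recv(source=target, tag=2)
    print(f"rank {rank} got {result} from progress rank {target}")
```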
A 17 degree of freedom anthropomorphic manipulator
NASA Technical Reports Server (NTRS)
Vold, Havard I.; Karlen, James P.; Thompson, Jack M., Jr.; Farrell, James D.; Eismann, Paul H.
1989-01-01
A 17 axis anthropomorphic manipulator, providing coordinated control of two seven degree of freedom arms mounted on a three degree of freedom torso-waist assembly, is presented. This massively redundant telerobot, designated the Robotics Research K/B-2017 Dexterous Manipulator, employs a modular mechanism design with joint-mounted actuators based on brushless motors and harmonic drive gear reducers. Direct joint torque control at the servo level causes these high-output joint drives to behave like direct-drive actuators, facilitating the implementation of an effective impedance control scheme. The redundant, but conservative motion control system models the manipulator as a spring-loaded linkage with viscous damping and rotary inertia at each joint. This approach allows for real time, sensor-driven control of manipulator pose using a hierarchy of competing rules, or objective functions, to avoid unplanned collisions with objects in the workplace, to produce energy-efficient, graceful motion, to increase leverage, to control effective impedance at the tool or to favor overloaded joints.
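The joint-level behavior described, each joint acting like a spring-damper about its commanded pose, reduces to a simple torque law; the sketch below uses placeholder gains and omits the rule-based redundancy resolution, so it is illustrative rather than the K/B-2017 controller.

```python
import numpy as np

# Schematic joint-space impedance control: each joint behaves like a spring-damper
# around its commanded pose. Gains and the gravity-compensation hook are placeholders.
def impedance_torque(q, q_dot, q_des, K, B, gravity_comp=None):
    """tau = K (q_des - q) - B q_dot [+ g(q)]"""
    tau = K @ (q_des - q) - B @ q_dot
    if gravity_comp is not None:
        tau = tau + gravity_comp(q)
    return tau

n = 17                                   # joints: two 7-axis arms plus a 3-axis torso
K = np.diag(np.full(n, 150.0))           # joint stiffness [N*m/rad], illustrative
B = np.diag(np.full(n, 8.0))             # joint damping [N*m*s/rad], illustrative

q = np.zeros(n)
q_dot = np.zeros(n)
q_des = np.full(n, 0.1)

tau = impedance_torque(q, q_dot, q_des, K, B)
print(tau[:3])   # commanded torques for the first three joints
```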
Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, J C; Fisher, J M; Gordon, J B
2007-10-02
The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.
Multisite Assessment of Nursing Continuing Education Learning Needs Using an Electronic Tool.
Winslow, Susan; Jackson, Stephanie; Cook, Lesley; Reed, Joanne Williams; Blakeney, Keshia; Zimbro, Kathie; Parker, Cindy
2016-02-01
A continued education needs assessment and associated education plan are required for organizations on the journey for American Nurses Credentialing Center Magnet® designation. Leveraging technology to support the assessment and analysis of continuing education needs was a new venture for a 12-hospital regional health system. The purpose of this performance improvement project was to design and conduct an enhanced process to increase the efficiency and effectiveness of gathering data on nurses' preferences and increase nurse satisfaction with the learner assessment portion of the process. Educators trialed the use of a standardized approach via an electronic survey tool to replace the highly variable processes previously used. Educators were able to view graphical summary of responses by category and setting, which substantially decreased analysis and action planning time for education implementation plans at the system, site, or setting level. Based on these findings, specific continuing education action plans were drafted for each category and classification of nurses. Copyright 2016, SLACK Incorporated.
Aerodynamic Analysis of the Truss-Braced Wing Aircraft Using Vortex-Lattice Superposition Approach
NASA Technical Reports Server (NTRS)
Ting, Eric Bi-Wen; Reynolds, Kevin Wayne; Nguyen, Nhan T.; Totah, Joseph J.
2014-01-01
The SUGAR Truss-Braced Wing (TBW) aircraft concept is a Boeing-developed N+3 aircraft configuration funded by the NASA ARMD Fixed Wing Project. This future generation transport aircraft concept is designed to be aerodynamically efficient by employing a high aspect ratio wing design. The aspect ratio of the TBW is on the order of 14, which is significantly greater than those of current generation transport aircraft. This paper presents a recent aerodynamic analysis of the TBW aircraft using a conceptual vortex-lattice aerodynamic tool, VORLAX, and an aerodynamic superposition approach. Based on the underlying linear potential flow theory, the principle of aerodynamic superposition is leveraged to deal with the complex aerodynamic configuration of the TBW. By decomposing the full configuration of the TBW into individual aerodynamic lifting components, the total aerodynamic characteristics of the full configuration can be estimated from the contributions of the individual components. The aerodynamic superposition approach shows excellent agreement with CFD results computed by FUN3D, USM3D, and STAR-CCM+.
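As a toy illustration of the superposition principle under linear potential flow, total lift can be approximated by summing component contributions referenced to a common area. The component names, coefficients, and area ratios below are made-up placeholders, not TBW or VORLAX values.

```python
import numpy as np

# (CL_alpha [1/rad], CL at alpha = 0, component reference area / wing reference area)
components = {
    "wing":  (5.6,  0.30, 1.00),
    "strut": (0.8,  0.02, 0.15),
    "tail":  (3.5, -0.05, 0.20),
}

def total_CL(alpha_rad):
    """Sum each component's lift-coefficient contribution, scaled by its
    area ratio, as linear superposition permits."""
    return sum(s * (cl0 + cla * alpha_rad)
               for cla, cl0, s in components.values())

print(total_CL(np.deg2rad(2.0)))  # total lift coefficient at 2 degrees angle of attack
```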
Software for the Integration of Multiomics Experiments in Bioconductor.
Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi
2017-11-01
Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.
NASA Astrophysics Data System (ADS)
Sato, Takashi; Honma, Michio; Itoh, Hiroyuki; Iriki, Nobuyuki; Kobayashi, Sachiko; Miyazaki, Norihiko; Onodera, Toshio; Suzuki, Hiroyuki; Yoshioka, Nobuyuki; Arima, Sumika; Kadota, Kazuya
2009-04-01
The categories and objectives of DFM production management are presented. DFM is not limited to an activity within a particular unit process in design or manufacturing; a new framework for DFM is required. DFM should be a total solution for the problems common to all processes, and each process must be linked organically to the others. After the whole flow on the manufacturing platform is completed, the quality of the final products is guaranteed and the products are shipped to market. The information platform is layered with DFM, APC, and AEC. Advanced DFM is not DFM for partial optimization of the lithography process and the design; it should be Organized DFM, managed with high-level organizational IQ. The interim quality between each step of the flow should be visualized. DFM becomes quality engineering when it is Organized DFM and common metrics of quality are provided, that is, through effective implementation of common industrial metrics and standardized technology. DFM is a differentiating technology, but it can leverage standards for efficient development.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Financial Assistance for NMVC Companies (Leverage) Funding Leverage by Use of Sba Guaranteed Trust... standing in respect to compliance with the financial, ethical, and reporting requirements of such body...
Culturally Aware Agents for Training Environments (CAATE): Phase I Final Report
2009-01-01
attitudes, relationships, personality, personal traits, state, social roles, physical context. The initial set of potentially important cultural... relationships that affect culturally situated behavior. For instance, we will want to be able to model interconnections such as familial... relationships, group membership, and attitudes (e.g., trust, dislike). To accomplish this, our design leverages social network modeling technologies provided by
ERIC Educational Resources Information Center
Borkowski, Ellen Yu; Henry, David; Larsen, Lida L.; Mateik, Deborah
This paper describes a four-tiered approach to supporting University of Maryland faculty in the development of instructional materials to be delivered via the World Wide Web. The approach leverages existing equipment and staff by the design of Web posting, editing, and management tools for use on the campus-wide information server,…
ERIC Educational Resources Information Center
Ferati, Mexhid Adem
2012-01-01
To access interactive systems, blind and visually impaired users can leverage their auditory senses by using non-speech sounds. The current structure of non-speech sounds, however, is geared toward conveying user interface operations (e.g., opening a file) rather than large theme-based information (e.g., a history passage) and, thus, is ill-suited…
Fighting Tomorrow's Fire Today: Leveraging Intelligence for Scenario-Based Exercise Design
2014-03-01
ERIC Educational Resources Information Center
Mazur, Amber D.; Brown, Barbara; Jacobsen, Michele
2015-01-01
The flipped classroom is an instructional model that leverages technology-enhanced instruction outside of class time in order to maximize student engagement and learning during class time. As part of an action research study, the authors synthesize reflections about how the flipped classroom model can support teaching, learning and assessment…
ERIC Educational Resources Information Center
Kim, Seon-Joo
2017-01-01
CAMPUS Asia (Collective Action for Mobility Program of University Students in Asia) is a student-exchange program designed to promote student mobility between South Korea, China, and Japan. Begun in 2011, the program aims to foster the next generation of leaders in Asia by nurturing young talents with shared visions. This article provides an…
ERIC Educational Resources Information Center
Smythe-Leistico, Kenneth; Page, Lindsay C.
2018-01-01
Poor school attendance in the early grades is predictive of poor subsequent educational outcomes. We report on a pilot intervention aiming to reduce chronic absenteeism in kindergarten. We designed and implemented a two-way, text-based parent-school communication system to encourage daily attendance, provide parents with personalized feedback on…
ERIC Educational Resources Information Center
Impelluso, Thomas J.
2009-01-01
Cognitive Load Theory (CLT) was used as a foundation to redesign a computer programming class for mechanical engineers, in which content was delivered with hybrid/distance technology. The effort confirmed the utility of CLT in course design and demonstrated that hybrid/distance learning is not merely a tool of convenience, but one which, when…
The Potential of Systems Thinking in Teacher Reform as Theorized for the Teaching Brain Framework
ERIC Educational Resources Information Center
Rodriguez, Vanessa
2013-01-01
The teaching brain is a dynamic system that is in constant interaction with the learning brain. If we fail to explore the teaching brain we will continue to design educational reform policies that ignore the most important lens in the classroom: the teachers'. Master teachers recognize their perspective and leverage their teaching brains to embody…
ERIC Educational Resources Information Center
Stamas, Paul J.
2013-01-01
This case study research followed the two-year transition of a medium-sized manufacturing firm towards a service-oriented enterprise. A service-oriented enterprise is an emerging architecture of the firm that leverages the paradigm of services computing to integrate the capabilities of the firm with the complementary competencies of business…
RT 24 - Architecture, Modeling & Simulation, and Software Design
2010-11-01
Focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology. Slide excerpts cover the DoDAF 2.0 metamodel, the BPMN metamodel, a mapping of SysML to DoDAF 2.0, and DoDAF V2.0 models (OV-2) alongside SysML requirement diagrams.
Kukafka, Rita; Khan, Sharib A.; Hutchinson, Carly; McFarlane, Delano J.; Li, Jianhua; Ancker, Jessica S.; Cohall, Alwyn
2007-01-01
We describe the steps taken by the Harlem Health Promotion Center to develop a community-specific health web portal aimed at promoting health and well-being in Harlem. Methods and results that begin with data collection and move onto elucidating requirements for the web portal are discussed. Sentiments of distrust in medical institutions, and the desire for community specific content and resources were among the needs emanating from our data analysis. These findings guided our decision to customize social software designed to foster connections, collaborations, flexibility, and interactivity; an “architecture of participation”. While we maintain that the leveraging of social software may indeed be the way to build healthy communities and support learning and engagement in underserved communities, our conclusion calls for careful thinking, testing and evaluation research to establish best practice models for leveraging these emerging technologies to support health improvements in the community. PMID:18693872
Moller, Arlen C.; Merchant, Gina; Conroy, David E.; West, Robert; Hekler, Eric B.; Kugler, Kari C.; Michie, Susan
2017-01-01
As more behavioral health interventions move from traditional to digital platforms, the application of evidence-based theories and techniques may be doubly advantageous. First, it can expedite digital health intervention development, improving efficacy, and increasing reach. Second, moving behavioral health interventions to digital platforms presents researchers with novel (potentially paradigm shifting) opportunities for advancing theories and techniques. In particular, the potential for technology to revolutionize theory refinement is made possible by leveraging the proliferation of “real-time” objective measurement and “big data” commonly generated and stored by digital platforms. Much more could be done to realize this potential. This paper offers proposals for better leveraging the potential advantages of digital health platforms, and reviews three of the cutting edge methods for doing so: optimization designs, dynamic systems modeling, and social network analysis. PMID:28058516
Feature Selection for Ridge Regression with Provable Guarantees.
Paul, Saurabh; Drineas, Petros
2016-04-01
We introduce single-set spectral sparsification as a deterministic sampling-based feature selection technique for regularized least-squares classification, which is the classification analog to ridge regression. The method is unsupervised and gives worst-case guarantees of the generalization power of the classification function after feature selection with respect to the classification function obtained using all features. We also introduce leverage-score sampling as an unsupervised randomized feature selection method for ridge regression. We provide risk bounds for both single-set spectral sparsification and leverage-score sampling on ridge regression in the fixed design setting and show that the risk in the sampled space is comparable to the risk in the full-feature space. We perform experiments on synthetic and real-world data sets; a subset of TechTC-300 data sets, to support our theory. Experimental results indicate that the proposed methods perform better than the existing feature selection methods.
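The leverage-score sampling idea can be sketched as follows. This is a generic illustration assuming rank-k column leverage scores computed from an SVD and sampling without replacement; the paper's exact sampling and rescaling scheme may differ.

```python
import numpy as np

def leverage_score_feature_selection(X, k, n_keep, seed=0):
    """Sample feature indices with probability proportional to their
    rank-k leverage scores (squared column norms of the top-k right
    singular vectors)."""
    rng = np.random.default_rng(seed)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Vk = Vt[:k]                     # shape (k, d): top-k right singular vectors
    scores = np.sum(Vk**2, axis=0)  # one leverage score per feature; sums to k
    probs = scores / scores.sum()
    idx = rng.choice(X.shape[1], size=n_keep, replace=False, p=probs)
    return np.sort(idx)

# Toy usage: keep 15 of 50 features, then fit ridge regression on X[:, keep].
X = np.random.default_rng(1).normal(size=(200, 50))
keep = leverage_score_feature_selection(X, k=10, n_keep=15)
```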
Designing informed game-based rehabilitation tasks leveraging advances in virtual reality.
Lange, Belinda; Koenig, Sebastian; Chang, Chien-Yen; McConnell, Eric; Suma, Evan; Bolas, Mark; Rizzo, Albert
2012-01-01
This paper details a brief history and rationale for the use of virtual reality (VR) technology for clinical research and intervention, and then focuses on game-based VR applications in the area of rehabilitation. An analysis of the match between rehabilitation task requirements and the assets available with VR technology is presented. Low-cost camera-based systems capable of tracking user behavior at sufficient levels for game-based virtual rehabilitation activities are currently available for in-home use. Authoring software is now being developed that aims to provide clinicians with a usable toolkit for leveraging this technology. This will facilitate informed professional input on software design, development and application to ensure safe and effective use in the rehabilitation context. The field of rehabilitation generally stands to benefit from the continual advances in VR technology, concomitant system cost reductions and an expanding clinical research literature and knowledge base. Home-based activity within VR systems that are low-cost, easy to deploy and maintain, and meet the requirements for "good" interactive rehabilitation tasks could radically improve users' access to care, adherence to prescribed training and subsequently enhance functional activity in everyday life in clinical populations.
Strategic defense initiative: critical issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuckolls, J.H.
The objectives of the Strategic Defense Initiative (SDI) as outlined by President Reagan are discussed. The principal objective for SDI is as a defense against ballistic missiles. Soviet objections and a summary of US-USSR dialogue on the subject are reviewed. Most US studies have been critical of SDI. Four critical issues are addressed in depth: are defense weapons technologically feasible which have high economic leverage relative to offensive ballistic missiles; would the defense feasibility and leverage be degraded or enhanced in the technological race between weapons innovation and countermeasures; could stability be achieved during and after the transition to the defense-dominated world envisioned by SDI proponents; would the deployment of high-leverage defensive weapons increase or decrease the security of NATO Europe, and the probability of major conventional or nuclear wars. The issue of SDI may lead to a paradox that contains the seeds of catastrophe. The author concludes by warning that nuclear disarmament may eliminate the highly successful deterrent mechanism for avoiding another major world war. In a world made safe for major conventional wars by the apparent "elimination" of nuclear weapons, the leaders in a conventional World War III - involving unimaginable suffering, hatred, terror, and death - would be strongly motivated to introduce nuclear weapons in the crucial decisive battles. Even if diplomacy could "eliminate" nuclear weapons, man's knowledge of nuclear weapons can never be eliminated. The paradox is that the attempt to eliminate nuclear weapons may maximize the probability of their use.
78 FR 17766 - Interagency Guidance on Leveraged Lending
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
..., Board, and FDIC that engage in leveraged lending activities. The number of community banks with substantial involvement in leveraged lending is small; therefore, the agencies generally expect community... potential impact of stressful events and circumstances on borrowers' financial condition. Recent financial...
Advances in high-power 9XXnm laser diodes for pumping fiber lasers
NASA Astrophysics Data System (ADS)
Skidmore, Jay; Peters, Matthew; Rossin, Victor; Guo, James; Xiao, Yan; Cheng, Jane; Shieh, Allen; Srinivasan, Raman; Singh, Jaspreet; Wei, Cailin; Duesterberg, Richard; Morehead, James J.; Zucker, Erik
2016-03-01
A multi-mode 9XXnm-wavelength laser diode was developed to optimize the divergence angle and reliable ex-facet power. Laser diodes were assembled into a multi-emitter pump package that is fiber coupled via spatial and polarization multiplexing. The pump package has a 135 μm diameter output fiber that leverages the same optical train and mechanical design qualified previously. Up to ~270 W CW power at 22 A is achieved at a case temperature of ~30 °C. Power conversion efficiency is 60% (peak) and drops to 53% at 22 A with little thermal rollover. Greater than 90% of the light is collected at <0.12 NA at 16 A drive current, which produces 3.0 W/(mm-mr)² radiance from the output fiber.
Seq-ing answers: uncovering the unexpected in global gene regulation.
Otto, George Maxwell; Brar, Gloria Ann
2018-04-19
The development of techniques for measuring gene expression globally has greatly expanded our understanding of gene regulatory mechanisms in depth and scale. We can now quantify every intermediate and transition in the canonical pathway of gene expression, from DNA to mRNA to protein, genome-wide. Employing such measurements in parallel can produce rich datasets, but extracting the most information requires careful experimental design and analysis. Here, we argue for the value of genome-wide studies that measure multiple outputs of gene expression over many timepoints during the course of a natural developmental process. We discuss our findings from a highly parallel gene expression dataset of meiotic differentiation, and those of others, to illustrate how leveraging these features can provide new and surprising insight into fundamental mechanisms of gene regulation.
Liquid on Paper: Rapid Prototyping of Soft Functional Components for Paper Electronics.
Han, Yu Long; Liu, Hao; Ouyang, Cheng; Lu, Tian Jian; Xu, Feng
2015-07-01
This paper describes a novel approach to fabricate paper-based electric circuits consisting of a paper matrix embedded with three-dimensional (3D) microchannels and liquid metal. Leveraging the high electric conductivity and good flowability of liquid metal, and the metallophobic property of paper, it is possible to keep the electric and mechanical functionality of the circuit even after a thousand cycles of deformation. Embedding liquid metal into a paper matrix is a promising method to rapidly fabricate low-cost, disposable, and soft electric circuits for electronics. As a demonstration, we designed a programmable displacement transducer and applied it as variable resistors and pressure sensors. The unique metallophobic property, combined with softness, low cost and light weight, makes paper an attractive alternative to other materials in which liquid metal is currently embedded.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, S; Jaing, C
The goal of this project is to develop forensic genotyping assays for select agent viruses, addressing a significant capability gap for the viral bioforensics and law enforcement community. We used a multipronged approach combining bioinformatics analysis, PCR-enriched samples, microarrays and TaqMan assays to develop high resolution and cost effective genotyping methods for strain level forensic discrimination of viruses. We have leveraged substantial experience and efficiency gained through year 1 on software development, SNP discovery, TaqMan signature design and phylogenetic signature mapping to scale up the development of forensics signatures in year 2. In this report, we have summarized the TaqMan signature development for South American hemorrhagic fever viruses, tick-borne encephalitis viruses and henipaviruses, Old World Arenaviruses, filoviruses, Crimean-Congo hemorrhagic fever virus, Rift Valley fever virus and Japanese encephalitis virus.
Guthrie, Kate M; Rosen, Rochelle K; Vargas, Sara E; Guillen, Melissa; Steger, Arielle L; Getz, Melissa L; Smith, Kelley A; Ramirez, Jaime J; Kojic, Erna M
2017-10-01
The development of HIV-preventive topical vaginal microbicides has been challenged by a lack of sufficient adherence in later stage clinical trials to confidently evaluate effectiveness. This dilemma has highlighted the need to integrate translational research earlier in the drug development process, essentially applying behavioral science to facilitate the advances of basic science with respect to the uptake and use of biomedical prevention technologies. In the last several years, there has been an increasing recognition that the user experience, specifically the sensory experience, as well as the role of meaning-making elicited by those sensations, may play a more substantive role than previously thought. Importantly, the role of the user-their sensory perceptions, their judgements of those experiences, and their willingness to use a product-is critical in product uptake and consistent use post-marketing, ultimately realizing gains in global public health. Specifically, a successful prevention product requires an efficacious drug, an efficient drug delivery system, and an effective user. We present an integrated iterative drug development and user experience evaluation method to illustrate how user-centered formulation design can be iterated from the early stages of preclinical development to leverage the user experience. Integrating the user and their product experiences into the formulation design process may help optimize both the efficiency of drug delivery and the effectiveness of the user.
Greased Lightning (GL-10) Flight Testing Campaign
NASA Technical Reports Server (NTRS)
Fredericks, William J.; McSwain, Robert G.; Beaton, Brian F.; Klassman, David W.; Theodore, Colin R.
2017-01-01
Greased Lightning (GL-10) is an aircraft configuration that combines the characteristics of a cruise-efficient airplane with the ability to perform vertical takeoff and landing (VTOL). This aircraft has been designed, fabricated and flight tested at the small unmanned aerial system (UAS) scale. This technical memorandum documents the procedures and findings of the flight test experiments. The GL-10 design utilized two key technologies to enable this unique aircraft design: namely, distributed electric propulsion (DEP) and inexpensive closed-loop controllers. These technologies enabled the flight of this inherently unstable aircraft. Overall, it has been determined through flight testing that a design that leverages these new technologies can yield a useful, cruise-efficient VTOL aircraft.
BIOINFORMATICS IN THE K-8 CLASSROOM: DESIGNING INNOVATIVE ACTIVITIES FOR TEACHER IMPLEMENTATION
Shuster, Michele; Claussen, Kira; Locke, Melly; Glazewski, Krista
2016-01-01
At the intersection of biology and computer science is the growing field of bioinformatics—the analysis of complex datasets of biological relevance. Despite the increasing importance of bioinformatics and associated practical applications, these are not standard topics in elementary and middle school classrooms. We report on a pilot project and its evolution to support implementation of bioinformatics-based activities in elementary and middle school classrooms. Specifically, we ultimately designed a multi-day summer teacher professional development workshop, in which teachers design innovative classroom activities. By focusing on teachers, our design leverages enhanced teacher knowledge and confidence to integrate innovative instructional materials into K-8 classrooms and contributes to capacity building in STEM instruction. PMID:27429860
Genetic Constructor: An Online DNA Design Platform.
Bates, Maxwell; Lachoff, Joe; Meech, Duncan; Zulkower, Valentin; Moisy, Anaïs; Luo, Yisha; Tekotte, Hille; Franziska Scheitz, Cornelia Johanna; Khilari, Rupal; Mazzoldi, Florencio; Chandran, Deepak; Groban, Eli
2017-12-15
Genetic Constructor is a cloud Computer Aided Design (CAD) application developed to support synthetic biologists from design intent through DNA fabrication and experiment iteration. The platform allows users to design, manage, and navigate complex DNA constructs and libraries, using a new visual language that focuses on functional parts abstracted from sequence. Features like combinatorial libraries and automated primer design allow the user to separate design from construction by focusing on functional intent, and design constraints aid iterative refinement of designs. A plugin architecture enables contributions from scientists and coders to leverage existing powerful software and connect to DNA foundries. The software is easily accessible and platform agnostic, free for academics, and available in an open-source community edition. Genetic Constructor seeks to democratize DNA design, manufacture, and access to tools and services from the synthetic biology community.
Improving Feedback through Online Professional Development
ERIC Educational Resources Information Center
Klein, Valerie; Fukawa-Connelly, Timothy; Silverman, Jason
2016-01-01
Teachers' focus on student thinking--and moving beyond superficial "right or wrong" analysis--is essential to effective teaching (Grossman and McDonald 2008; NCTM 2000; Son and Sinclair 2010). Interpreting and evaluating student thinking and providing high-quality feedback are seen as high-leverage practices and are "likely to lead…
Pennsylvania Pre-K Counts: End of Year Report, 2009-2010
ERIC Educational Resources Information Center
Pennsylvania Department of Education, 2010
2010-01-01
Pennsylvania Pre-K Counts was created to provide research-based, high quality pre-kindergarten opportunities to at-risk children across the commonwealth by leveraging the existing early education services in schools, Keystone STARS child care programs, Head Start, and licensed nursery schools. The standards are high and the accountability…
Helping Soldiers Leverage Army Knowledge, Skills, and Abilities in Civilian Jobs
2017-01-01
Table excerpts list civilian occupations recommended as matches for Army knowledge, skills, and abilities, with projected openings, median wages, and typical education requirements (e.g., wind turbine service technicians: 3,710 openings, $48,800, high school diploma); other fragments describe servicing automotive electrical systems, including wiring harnesses and starting and charging systems.
From Binary Notation to Gravitational Waves: Rocket Science Made Easy
NASA Technical Reports Server (NTRS)
Fisher, Diane K.; Leon, Nancy J.; Cooper, Larry
2001-01-01
The Space Place is a NASA educational outreach program open to all NASA missions, studies, and instruments. It uses diverse media (web, print, displays, hands-on activities) to deliver high-quality products through a highly leveraged infrastructure. Additional information is contained in the original extended abstract.
Through Technology Leverage and Risk Avoidance
ERIC Educational Resources Information Center
Sugasawa, Yoshio; Shinomiya, Takeshi
2010-01-01
Companies make concerted efforts to survive in a radically changing global society with the advent of a highly-networked and information-rich society that is featured by intense market competition. Manufacturing industries in particular have a tendency to rely on technological development strengths as a means of survival in a highly globalised and…
77 FR 19417 - Proposed Guidance on Leveraged Lending
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... engaged in leveraged lending activities. The number of community banking organizations with substantial exposure to leveraged lending is very small; therefore the Agencies generally expect that community banking... that they understand their risks and the potential impact of stressful events and circumstances on...
Pretel, R; Shoener, B D; Ferrer, J; Guest, J S
2015-12-15
Anaerobic membrane bioreactors (AnMBRs) enable energy recovery from wastewater while simultaneously achieving high levels of treatment. The objective of this study was to elucidate how detailed design and operational decisions of submerged AnMBRs influence the technological, environmental, and economic sustainability of the system across its life cycle. Specific design and operational decisions evaluated included: solids retention time (SRT), mixed liquor suspended solids (MLSS) concentration, sludge recycling ratio (r), flux (J), and specific gas demand per membrane area (SGD). The possibility of methane recovery (both as biogas and as soluble methane in reactor effluent) and bioenergy production, nutrient recovery, and final destination of the sludge (land application, landfill, or incineration) were also evaluated. The implications of these design and operational decisions were characterized by leveraging a quantitative sustainable design (QSD) framework which integrated steady-state performance modeling across seasonal temperatures (using pilot-scale experimental data and the simulation software DESASS), life cycle cost (LCC) analysis, and life cycle assessment (LCA). Sensitivity and uncertainty analyses were used to characterize the relative importance of individual design decisions, and to navigate trade-offs across environmental, economic, and technological criteria. Based on this analysis, there are design and operational conditions under which submerged AnMBRs could be net energy positive and contribute to the pursuit of carbon negative wastewater treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Bikson, Marom; Rahman, Asif; Datta, Abhishek; Fregni, Felipe; Merabet, Lotfi
2012-01-01
Objectives Transcranial direct current stimulation (tDCS) is a neuromodulatory technique that delivers low-intensity currents facilitating or inhibiting spontaneous neuronal activity. tDCS is attractive since dose is readily adjustable by simply changing electrode number, position, size, shape, and current. In the recent past, computational models have been developed with increased precision with the goal to help customize tDCS dose. The aim of this review is to discuss the incorporation of high-resolution patient-specific computer modeling to guide and optimize tDCS. Methods In this review, we discuss the following topics: (i) The clinical motivation and rationale for models of transcranial stimulation is considered pivotal in order to leverage the flexibility of neuromodulation; (ii) The protocols and the workflow for developing high-resolution models; (iii) The technical challenges and limitations of interpreting modeling predictions, and (iv) Real cases merging modeling and clinical data illustrating the impact of computational models on the rational design of rehabilitative electrotherapy. Conclusions Though modeling for non-invasive brain stimulation is still in its development phase, it is predicted that with increased validation, dissemination, simplification and democratization of modeling tools, computational forward models of neuromodulation will become useful tools to guide the optimization of clinical electrotherapy. PMID:22780230
NASA Astrophysics Data System (ADS)
Wendel, Christopher H.; Gao, Zhan; Barnett, Scott A.; Braun, Robert J.
2015-06-01
Electrical energy storage is expected to be a critical component of the future world energy system, performing load-leveling operations to enable increased penetration of renewable and distributed generation. Reversible solid oxide cells, operating sequentially between power-producing fuel cell mode and fuel-producing electrolysis mode, have the capability to provide highly efficient, scalable electricity storage. However, challenges ranging from cell performance and durability to system integration must be addressed before widespread adoption. One central challenge of the system design is establishing effective thermal management in the two distinct operating modes. This work leverages an operating strategy to use carbonaceous reactant species and operate at intermediate stack temperature (650 °C) to promote exothermic fuel-synthesis reactions that thermally self-sustain the electrolysis process. We present performance of a doped lanthanum-gallate (LSGM) electrolyte solid oxide cell that shows high efficiency in both operating modes at 650 °C. A physically based electrochemical model is calibrated to represent the cell performance and used to simulate roundtrip operation for conditions unique to these reversible systems. Design decisions related to system operation are evaluated using the cell model including current density, fuel and oxidant reactant compositions, and flow configuration. The analysis reveals tradeoffs between electrical efficiency, thermal management, energy density, and durability.
Estimation of the Continuous and Discontinuous Leverage Effects
Aït-Sahalia, Yacine; Fan, Jianqing; Laeven, Roger J. A.; Wang, Christina Dan; Yang, Xiye
2017-01-01
This paper examines the leverage effect, or the generally negative covariation between asset returns and their changes in volatility, under a general setup that allows the log-price and volatility processes to be Itô semimartingales. We decompose the leverage effect into continuous and discontinuous parts and develop statistical methods to estimate them. We establish the asymptotic properties of these estimators. We also extend our methods and results (for the continuous leverage) to the situation where there is market microstructure noise in the observed returns. We show in Monte Carlo simulations that our estimators have good finite sample performance. When applying our methods to real data, our empirical results provide convincing evidence of the presence of the two leverage effects, especially the discontinuous one. PMID:29606780
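As a crude illustration of the quantity at issue (not the semimartingale-based estimators developed in the paper), one can compute the sample covariance between returns and subsequent changes in a rolling volatility proxy; the window length and the simulated data below are arbitrary placeholders.

```python
import numpy as np

def empirical_leverage(returns, window=78):
    """Heuristic leverage-effect measure: covariance between returns and
    the next change in a rolling realized-volatility proxy. A negative
    value reflects the usual leverage effect."""
    r = np.asarray(returns, dtype=float)
    vol = np.array([r[i:i + window].std() for i in range(len(r) - window)])
    dvol = np.diff(vol)                  # change in local volatility
    return np.cov(r[:len(dvol)], dvol)[0, 1]

# Toy usage with simulated returns (real price data would replace this).
ret = np.random.default_rng(2).normal(0.0, 0.01, size=5_000)
print(empirical_leverage(ret))
```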
NASA Technical Reports Server (NTRS)
1984-01-01
High leverage technologies are examined for application to the space station. The areas under investigation include attitude control, data management, long life thermal management, and automated housekeeping integration.
13 CFR 108.1120 - General eligibility requirement for Leverage.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false General eligibility requirement for Leverage. 108.1120 Section 108.1120 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM SBA Financial Assistance for NMVC Companies (Leverage...
A study about the existence of the leverage effect in stochastic volatility models
NASA Astrophysics Data System (ADS)
Florescu, Ionuţ; Pãsãricã, Cristian Gabriel
2009-02-01
The empirical relationship between the return of an asset and the volatility of the asset has been well documented in the financial literature. Named the leverage effect or sometimes risk-premium effect, it is observed in real data that, when the return of the asset decreases, the volatility increases and vice versa. Consequently, it is important to demonstrate that any formulated model for the asset price is capable of generating this effect observed in practice. Furthermore, we need to understand the conditions on the parameters present in the model that guarantee the emergence of the leverage effect. In this paper we analyze two general specifications of stochastic volatility models and their capability of generating the perceived leverage effect. We derive conditions for the emergence of the leverage effect in both of these stochastic volatility models. We exemplify using stochastic volatility models used in practice and we explicitly state the conditions for the existence of the leverage effect in these examples.
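One standard way stochastic volatility models generate the leverage effect is to correlate the shocks driving price and variance. The Euler sketch below uses a Heston-type specification with a negative correlation rho; all parameter values are purely illustrative and are not taken from the paper.

```python
import numpy as np

def simulate_sv_with_leverage(T=1.0, n=10_000, mu=0.05, kappa=3.0, theta=0.04,
                              xi=0.4, rho=-0.7, s0=100.0, v0=0.04, seed=0):
    """Euler discretization of a Heston-type stochastic volatility model.
    With rho < 0, negative returns tend to coincide with increases in
    variance, i.e., the leverage effect."""
    rng = np.random.default_rng(seed)
    dt = T / n
    s, v = np.empty(n + 1), np.empty(n + 1)
    s[0], v[0] = s0, v0
    for i in range(n):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal()
        v[i + 1] = max(v[i] + kappa * (theta - v[i]) * dt
                       + xi * np.sqrt(v[i] * dt) * z2, 1e-12)
        s[i + 1] = s[i] * np.exp((mu - 0.5 * v[i]) * dt + np.sqrt(v[i] * dt) * z1)
    return s, v

prices, variances = simulate_sv_with_leverage()
rets = np.diff(np.log(prices))
print(np.corrcoef(rets, np.diff(variances))[0, 1])  # negative when rho < 0
```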
The Leverage Effect on Wealth Distribution in a Controllable Laboratory Stock Market
Zhu, Chenge; Yang, Guang; An, Kenan; Huang, Jiping
2014-01-01
Wealth distribution has always been an important issue in our economic and social life, since it affects the harmony and stabilization of the society. Against the background of the widespread use of financial tools to raise leverage in recent years, we studied the leverage effect on the wealth distribution of a population in a controllable laboratory market in which we have conducted several human experiments, and drawn the conclusion that higher leverage leads to a higher Gini coefficient in the market. A higher Gini coefficient means the wealth distribution among a population becomes more unequal. This is a result of the ascending risk with growing leverage level in the market plus the diversified trading abilities and risk preferences of the participants. This work sheds light on the effects of leverage and its related regulations, especially its impact on wealth distribution. It also shows the capability of the method of controllable laboratory markets, which could be helpful in several fields of study such as economics, econophysics and sociology. PMID:24968222
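For reference, the Gini coefficient used above as the inequality measure can be computed directly from a wealth vector. This is the standard formula, not the authors' experimental code.

```python
import numpy as np

def gini(wealth):
    """Gini coefficient of a non-negative wealth vector (0 = perfect
    equality, values near 1 = extreme inequality). Exact discrete form
    of 1 - 2 * (area under the Lorenz curve)."""
    w = np.sort(np.asarray(wealth, dtype=float))
    n = w.size
    cum = np.cumsum(w)
    return 1.0 + 1.0 / n - 2.0 * np.sum(cum) / (n * cum[-1])

print(gini([1, 1, 1, 1]))   # 0.0
print(gini([0, 0, 0, 10]))  # 0.75
```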
Preliminary Design Considerations for Access and Operations in Earth-Moon L1/L2 Orbits
NASA Technical Reports Server (NTRS)
Folta, David C.; Pavlak, Thomas A.; Haapala, Amanda F.; Howell, Kathleen C.
2013-01-01
Within the context of manned spaceflight activities, Earth-Moon libration point orbits could support lunar surface operations and serve as staging areas for future missions to near-Earth asteroids and Mars. This investigation examines preliminary design considerations including Earth-Moon L1/L2 libration point orbit selection, transfers, and stationkeeping costs associated with maintaining a spacecraft in the vicinity of L1 or L2 for a specified duration. Existing tools in multi-body trajectory design, dynamical systems theory, and orbit maintenance are leveraged in this analysis to explore end-to-end concepts for manned missions to Earth-Moon libration points.
NASA Astrophysics Data System (ADS)
Garsha, Karl E.
2004-06-01
There is an increasing amount of interest in functionalized microstructural, microphotonic and microelectromechanical systems (MEMS) for use in biological applications. By scanning a tightly focused ultra-short pulsed laser beam inside a wide variety of commercially available polymer systems, the flexibility of the multiphoton microscope can be extended to include routine manufacturing of micro-devices with feature sizes well below the diffraction limit. Compared with lithography, two-photon polymerization has the unique ability to additively realize designs with high resolution in three dimensions; this permits the construction of cross-linked components and structures with hollow cavities. In light of the increasing availability of multiphoton imaging systems at research facilities, femtosecond laser manufacturing becomes particularly attractive in that the modality provides a readily accessible, rapid and high-accuracy 3-D processing capability to biological investigators interested in culture scaffolds and biomimetic tissue engineering, bio-MEMS, biomicrophotonics and microfluidics applications. This manuscript overviews recent efforts toward enabling user-accessible 3-D micro-manufacturing capabilities on a conventional proprietary-based imaging system. Software which permits the off-line design of microstructures and leverages the extensibility of proprietary LCSM image acquisition software to realize designs is introduced. The requirements for multiphoton photo-disruption (ablation) are in some ways analogous to those for multiphoton polymerization. Hence, "beam-steering" also facilitates precision photo-disruption of biological tissues with 3-D resolution, and applications involving tissue microdissection and intracellular microsurgery or three-dimensionally resolved fluorescence recovery after photobleaching (FRAP) studies can benefit from this work as well.
The perceived value of using BIM for energy simulation
NASA Astrophysics Data System (ADS)
Lewis, Anderson M.
Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. Some of the benefits associated with BIM include but are not limited to cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store information within a model and can be used as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be a time-consuming activity due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the amount of time required to run energy simulations and can facilitate continuous energy simulations throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how perceptions associated with leveraging BIM for energy simulation differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups, which include BIM-only users, energy simulation-only users and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers and benefits of implementing BIM-based energy simulation. Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value associated with using information from BIMs to inform energy simulation and their engagement level with BIM and/or energy simulation. However, green design stakeholder perceptions of the value associated with using information from BIMs to inform energy simulation and their engagement with BIM and/or energy simulation may differ between different user groups (i.e. BIM users only, energy simulation users only, and BIM and energy simulation users). For example, the BIM-only user group appeared to have a strong positive correlation between the perceptions of the value associated with using information from BIMs to inform energy simulation and their engagement with BIM. Additionally, this study suggests that the top perceived benefits of using BIMs to inform energy simulations among green design stakeholders are: facilitation of communication, reduction of process-related costs, and giving users the ability to examine more design options. The main perceived barrier of using BIMs to inform energy simulations among green design stakeholders was a lack of BIM standards for model integration with multidisciplinary teams. Results from this study will help readers understand how to better implement BIM-based energy simulation while mitigating barriers and optimizing benefits. Additionally, examining discrepancies between user groups can lead to the identification and improvement of shortfalls in current BIM-based energy simulation processes. Understanding how perceptions and engagement levels differ among different software user groups will help in developing strategies for implementing BIM-based energy simulation that are tailored to each specific user group.
Industrial knowledge design: an approach for designing information artifacts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schatz, Sae; Berking, Peter; Raybourn, Elaine M.
In this study, the authors define a new approach that addresses the challenge of efficiently designing informational artefacts for optimal knowledge acquisition, an important issue in cognitive ergonomics. Termed Industrial Knowledge Design (or InK'D), it draws from information-related (e.g. informatics) and neurosciences-related (e.g. neuroergonomics) disciplines. Although it can be used for a broad scope of communication-driven business functions, our focus as learning professionals is on conveying knowledge for purposes of training, education, and performance support. This paper discusses preliminary principles of InK'D practice that can be employed to maximise the quality and quantity of transferred knowledge through interaction design. The paper codifies tacit knowledge into explicit concepts that can be leveraged by expert and non-expert knowledge designers alike.
High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model
NASA Astrophysics Data System (ADS)
Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng
The emergence of the Internet has changed high technology marketing channels thoroughly in the past decade, and e-commerce has already become one of the most efficient channels through which high technology firms may skip intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high technology products or services through the Internet is not easy. To overcome the above-mentioned problems, a novel analytic framework based on the concept of high technology customers' competence set expansion by leveraging high technology service firms' capabilities and resources, as well as novel multiple criteria decision making (MCDM) techniques, is proposed in order to define an appropriate e-business model. An empirical example study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided for verifying the effectiveness of this novel analytic framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customer's SIP transactions. In the future, the novel MCDM framework can be applied successfully to new business model definitions in the high technology industry.
13 CFR 107.1100 - Types of Leverage and application procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Types of Leverage and application procedures. 107.1100 Section 107.1100 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES SBA Financial Assistance for Licensees (Leverage) General Information About...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-13
... Programs Division, Business Operations, Federal Student Aid, U.S. Department of Education, 830 First Street... Operations, Federal Student Aid, U.S. Department of Education, 830 First Street, NE., room UCP-062E3... DEPARTMENT OF EDUCATION Federal Student Aid; Leveraging Educational Assistance Partnership...
New Mexico Small Business Assistance (NMSBA) September 2016 Advisory Council Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larkin, Ariana Kayla
This is an update on two projects headed by Los Alamos National Laboratory and Sandia National Laboratories. The two projects are: The Electrochemical Based Gas Analyzer for Automotive Diagnostic and Maximizing the Production of High Value and High Demand Guar Gum on Marginal Lands in New Mexico. The Electrochemical Based Gas Analyzer for Automotive Diagnostic NMSBA leveraged project is made up of Albuquerque companies, Automotive Test Solutions, Inc. (ATS), ATS Mobile Diagnostics and Thoma Technologies and Los Alamos small business, VI Control Systems, to develop a new sensor system for the automotive industry. The Guar Gum NMSBA Leveraged Project began in January 2016 with the goal to develop biotechnology to enable a genetic modification of prairie cordgrass, a renewable feedstock for bioenergy and bio-manufacturing. In the long term, the companies hope to use the technology to bio-manufacture high value products in the stem of the plant. This document describes the laboratories' cooperation with small businesses on these projects.
ERIC Educational Resources Information Center
Lasserre, Kaye E.; Moffatt, Jennifer J.
2013-01-01
The paper reports on a project where the objective was for the Rural Clinical School, The University of Queensland, Australia, to design an acceptable model of research skills workshops for medical students and rural health professionals. Eight, interactive research skills workshops focused on skill development were conducted in rural Queensland,…
ERIC Educational Resources Information Center
Martinez, Danny C.; Montaño, Elizabeth
2016-01-01
In this article, we report findings from a yearlong design research project that worked to leverage the language brokering skills of Latina/Latino middle school youth in an urban school setting. We began the project by asking seventh-grade students to talk about the many languages they speak in their daily lives. Throughout the project, their…
Moving Our Can(n)ons: Toward an Appreciation of Multimodal Texts in the Classroom
ERIC Educational Resources Information Center
Jiménez, Laura M.; Roberts, Kathryn L.; Brugar, Kristy A.; Meyer, Carla K.; Waito, Kim
2017-01-01
The growing popularity of graphic novels for younger readers is hard to miss. This article provides specific ways to think about, recognize, and teach with multimodal texts that leverage student interest. In this English language arts unit, we taught a sixth-grade class how to read and comprehend the complex design elements common to the graphic…
ERIC Educational Resources Information Center
Crawford, Evan
2017-01-01
Studies suggest that between one-fourth and one-third of localities elect their leaders on partisan ballots. Does the presence of a party label on the ballot affect the level of partisanship in local office? I leverage the fact that within select states, school boards vary as to whether their members are elected on partisan or nonpartisan ballots.…
geneLAB: Expanding the Impact of NASA's Biological Research in Space
NASA Technical Reports Server (NTRS)
Rayl, Nicole; Smith, Jeffrey D.
2014-01-01
The geneLAB project is designed to leverage the value of large 'omics' datasets from molecular biology projects conducted on the ISS by making these datasets available, citable, discoverable, interpretable, reusable, and reproducible. geneLAB will create a collaboration space with an integrated set of tools for depositing, accessing, analyzing, and modeling these diverse datasets from spaceflight and related terrestrial studies.
Considering IIOT and security for the DoD
NASA Astrophysics Data System (ADS)
Klawon, Kevin; Gold, Josh; Bachman, Kristen; Landoll, Darren
2016-05-01
The Internet of Things (IoT) has come of age and domestic and industrial devices are all "smart". But how can they be universally classified and queried? How do we know that the underlying architecture is secure enough to deploy on a defense network? These questions can be addressed by leveraging existing platforms designed for interoperability, extensibility, and security that can manage data across multiple domains and run on any platform.
Technical Assessment: Autonomy
2015-02-01
and video games. If DoD develops CONOPS for lower-performance systems, there is an opportunity to leverage a large amount of private investment, as the...originally designed for the Xbox video game platform, it is now being used or developed for retail environments, operating rooms, and physical therapy...approaches that render artificial intelligence less susceptible to intelligent influence. One area worthy of consideration is applied game theory, which
2015-06-01
version of the Bear operating system. The full system is depicted in Figure 3 and is composed of a minimalist micro-kernel with an associated...which are intended to support a general virtual machine execution environment, this minimalist hypervisor is designed to support only the operations...The use of a minimalist hypervisor in the Bear system opened the door to discovery of zero-day exploits. The approach leverages the hypervisors
Gase, Lauren N; Kuo, Tony; Coller, Karen; Guerrero, Lourdes R; Wong, Mitchell D
2014-09-01
We examined multiple variables influencing school truancy to identify potential leverage points to improve school attendance. A cross-sectional observational design was used to analyze inner-city data collected in Los Angeles County, California, during 2010 to 2011. We constructed an ordinal logistic regression model with cluster robust standard errors to examine the association between truancy and various covariates. The sample was predominantly Hispanic (84.3%). Multivariable analysis revealed greater truancy among students (1) with mild (adjusted odds ratio [AOR] = 1.57; 95% confidence interval [CI] = 1.22, 2.01) and severe (AOR = 1.80; 95% CI = 1.04, 3.13) depression (referent: no depression), (2) whose parents were neglectful (AOR = 2.21; 95% CI = 1.21, 4.03) or indulgent (AOR = 1.71; 95% CI = 1.04, 2.82; referent: authoritative parents), (3) who perceived less support from classes, teachers, and other students regarding college preparation (AOR = 0.87; 95% CI = 0.81, 0.95), (4) who had low grade point averages (AOR = 2.34; 95% CI = 1.49, 4.38), and (5) who reported using alcohol (AOR = 3.47; 95% CI = 2.34, 5.14) or marijuana (AOR = 1.59; 95% CI = 1.06, 2.38) during the past month. Study findings suggest depression, substance use, and parental engagement as potential leverage points for public health to intervene to improve school attendance.
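The modeling step described above (an ordinal truancy outcome regressed on covariates, with coefficients reported as adjusted odds ratios) can be sketched with statsmodels. The data, variable names, and coefficients below are synthetic placeholders, not the study's dataset, and the sketch omits the cluster robust standard errors used in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Synthetic stand-in data (hypothetical variables, not the study's dataset).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "depression":  rng.integers(0, 3, n),   # 0 none, 1 mild, 2 severe
    "gpa":         rng.normal(2.8, 0.6, n),
    "alcohol_use": rng.integers(0, 2, n),
})
latent = (0.5 * df["depression"] - 0.8 * df["gpa"]
          + 1.2 * df["alcohol_use"] + rng.logistic(size=n))
df["truancy"] = pd.cut(latent, bins=[-np.inf, -2.0, 0.0, np.inf],
                       labels=["low", "medium", "high"])  # ordered categories

mod = OrderedModel(df["truancy"], df[["depression", "gpa", "alcohol_use"]],
                   distr="logit")
res = mod.fit(method="bfgs", disp=False)
print(np.exp(res.params[:3]))  # adjusted odds ratios for the three covariates
```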
Unified Photo Enhancement by Discovering Aesthetic Communities From Flickr.
Hong, Richang; Zhang, Luming; Tao, Dacheng
2016-03-01
Photo enhancement refers to the process of increasing the aesthetic appeal of a photo, such as changing the photo aspect ratio and spatial recomposition. It is a widely used technique in the printing industry, graphic design, and cinematography. In this paper, we propose a unified and socially aware photo enhancement framework which can leverage the experience of photographers with various aesthetic topics (e.g., portrait and landscape). We focus on photos from the image hosting site Flickr, which has 87 million users and to which more than 3.5 million photos are uploaded daily. First, a tagwise regularized topic model is proposed to describe the aesthetic topic of each Flickr user, and coherent and interpretable topics are discovered by leveraging both the visual features and tags of photos. Next, a graph is constructed to describe the similarities in aesthetic topics between the users. Noticeably, densely connected users have similar aesthetic topics, which are categorized into different communities by a dense subgraph mining algorithm. Finally, a probabilistic model is exploited to enhance the aesthetic attractiveness of a test photo by leveraging the photographic experiences of Flickr users from the corresponding communities of that photo. Paired-comparison-based user studies show that our method performs competitively on photo retargeting and recomposition. Moreover, our approach accurately detects aesthetic communities in a photo set crawled from nearly 100000 Flickr users.
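A minimal sketch of the community-discovery step described above: build a user-similarity graph from per-user aesthetic topic distributions and group densely connected users. The topic vectors, similarity threshold, and modularity-based grouping here are illustrative stand-ins for the paper's tagwise regularized topic model and dense-subgraph mining algorithm.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Illustrative per-user aesthetic topic distributions (rows sum to 1).
topics = np.array([
    [0.8, 0.1, 0.1],   # user 0: mostly "landscape"
    [0.7, 0.2, 0.1],   # user 1
    [0.1, 0.8, 0.1],   # user 2: mostly "portrait"
    [0.2, 0.7, 0.1],   # user 3
])

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Connect users whose topic vectors are close (illustrative threshold).
G = nx.Graph()
G.add_nodes_from(range(len(topics)))
for i in range(len(topics)):
    for j in range(i + 1, len(topics)):
        sim = cosine(topics[i], topics[j])
        if sim > 0.9:
            G.add_edge(i, j, weight=sim)

# Densely connected users fall into the same aesthetic community
# (modularity grouping used here as a stand-in for dense-subgraph mining).
communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])   # e.g. [[0, 1], [2, 3]]
```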
17 CFR 31.15 - Reporting to leverage customers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... short leverage contract. (3) The net ledger balance carried in the leverage customer's account as of the... customer; (4) A detailed accounting of all financial charges and credits to the previous ledger balance...-point type: IF YOU BELIEVE YOUR MONTHLY STATEMENT IS INACCURATE YOU SHOULD PROMPTLY CONTACT (name of LTM...
13 CFR 107.1130 - Leverage fees and additional charges payable by Licensee.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Leverage fees and additional charges payable by Licensee. 107.1130 Section 107.1130 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES SBA Financial Assistance for Licensees (Leverage) General...
13 CFR 107.1000 - Licensees without Leverage-exceptions to the regulations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Licensees without Leverage-exceptions to the regulations. 107.1000 Section 107.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES Non-leveraged Licensees-Exceptions to Regulations § 107.1000...
13 CFR 107.1150 - Maximum amount of Leverage for a Section 301(c) Licensee.
Code of Federal Regulations, 2010 CFR
2010-01-01
... conservative investment strategy that limits downside risk. Any such Leverage request must be supported by an up-to-date business plan that reflects continuation of the Licensee's successful investment strategy... ADMINISTRATION SMALL BUSINESS INVESTMENT COMPANIES SBA Financial Assistance for Licensees (Leverage) Maximum...
13 CFR 108.1130 - Leverage fees payable by NMVC Company.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Leverage fees payable by NMVC Company. 108.1130 Section 108.1130 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM SBA Financial Assistance for NMVC Companies (Leverage) General...
76 FR 80217 - Rural Business Investment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
... to the Secretary stating that the non-rated debt instrument is equivalent in risk to the issuer's... provisions for Rural Business Investment Companies (RBIC) that wish to participate in a non-leveraged... exclusions for the RBIP for both leveraged and non-leveraged RBICs. DATES: Effective date. This rule will...
17 CFR 31.6 - Registration of leverage commodities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... commodity's economic value and how such amendments might affect the ability of leverage customers making or... a change in the economic value of such commodities and, if so, quantify the extent of such changes... the ability of leverage customers electing to make or take delivery of the commodity at an economic...
12 CFR 325.6 - Issuance of directives.
Code of Federal Regulations, 2010 CFR
2010-01-01
... is a final order issued to a bank that fails to maintain capital at or above the minimum leverage... operating with less than the minimum leverage capital requirement established by this regulation, the Board... directive requiring the bank to restore its capital to the minimum leverage capital requirement within a...
Leveraged resources and systems changes in community collaboration.
Harper, Christopher R; Kuperminc, Gabriel P; Weaver, Scott R; Emshoff, Jim; Erickson, Steve
2014-12-01
Most models of community collaboration emphasize the ability of diverse partners to come together to enact systematic changes that improve the health of individuals and communities. The ability of these groups to leverage resources is thought to be an important marker of successful collaboration and eventual improvements in community health. However, there is a paucity of research addressing linkages between systems change activities and leveraged resources. This study used a sample of collaboratives (N = 157) that received technical assistance and funding through the Georgia Family Connection Partnership (GaFCP) between 2006 and 2007. Data were collected from collaborative report of activities and funding, member ratings of collaborative functioning, and characteristics of the communities served by the collaboratives drawn from US Census data. Cross-lagged regression models tested longitudinal associations between systems change activities and leveraged dollars. The results indicated that systems change activities predict increased leveraging of resources from state/federal and private partners. However, there was no evidence that systems changes were linked with leveraging resources from local groups and agencies. These findings have important implications for providing technical assistance and training to health partnerships. Furthermore, future research should consider the relative strength of different systems change activities in relation to the ability of coalitions to leverage resources.
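A cross-lagged model pairs each wave-2 outcome with both variables measured at wave 1. The sketch below fits one such equation with ordinary least squares on synthetic data; the variable names, effect sizes, and single-equation OLS form are illustrative simplifications, not the GaFCP measures or the study's estimator.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 157                                      # number of collaboratives in the sample

# Synthetic standardized scores at two waves (illustrative only).
sys_change_t1 = rng.normal(size=n)           # systems change activities, wave 1
leveraged_t1 = rng.normal(size=n)            # leveraged dollars, wave 1
# Wave-2 leveraged dollars built with a true cross-lagged effect of 0.4.
leveraged_t2 = 0.5 * leveraged_t1 + 0.4 * sys_change_t1 + rng.normal(size=n)

# Cross-lagged equation: wave-2 outcome regressed on both wave-1 predictors.
X = sm.add_constant(np.column_stack([leveraged_t1, sys_change_t1]))
model = sm.OLS(leveraged_t2, X).fit()
print(model.params)   # constant, autoregressive path, cross-lagged path
```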
Leveraging social system networks in ubiquitous high-data-rate health systems.
Massey, Tammara; Marfia, Gustavo; Stoelting, Adam; Tomasi, Riccardo; Spirito, Maurizio A; Sarrafzadeh, Majid; Pau, Giovanni
2011-05-01
Social system networks with high data rates and limited storage will discard data if the system cannot connect and upload the data to a central server. We address the challenge of limited storage capacity in mobile health systems during network partitions with a heuristic that achieves efficiency in storage capacity by modifying the granularity of the medical data during long intercontact periods. Patterns in the connectivity, reception rate, distance, and location are extracted from the social system network and leveraged in the global algorithm and online heuristic. In the global algorithm, the stochastic nature of the data is modeled with maximum likelihood estimation based on the distribution of the reception rates. In the online heuristic, the correlation between system position and the reception rate is combined with patterns in human mobility to estimate the intracontact and intercontact time. The online heuristic performs well with a low data loss of 2.1%-6.1%.
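The global algorithm described above fits the stochastic behavior of the network by maximum likelihood. A minimal sketch follows; modeling intercontact gaps as exponential and the specific buffer figure are assumptions made for illustration, not the paper's estimator or parameters.

```python
import numpy as np

# Illustrative intercontact gaps (minutes) observed by a mobile health node.
gaps = np.array([3.0, 7.5, 2.2, 12.0, 5.1, 9.4, 4.3])

# MLE of an exponential rate parameter is the reciprocal of the sample mean.
rate_hat = 1.0 / gaps.mean()

# Probability that the next gap exceeds what the local buffer can absorb at
# full data granularity; if this is high, coarsen the stored data.
buffer_minutes = 10.0
p_overflow = np.exp(-rate_hat * buffer_minutes)
print(f"rate = {rate_hat:.3f}/min, P(gap > {buffer_minutes:.0f} min) = {p_overflow:.2f}")
```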
Tomlinson, Mark; Rahman, Atif; Sanders, David; Maselko, Joanna; Rotheram-Borus, Mary Jane
2014-01-01
Children need to be protected in intergenerational networks, with parents who have positive mood, resources to feed their children, and skills to promote early childhood development (ECD). Globally, more than 200 million children are raised annually without these resources. This article reviews the potential contributions of increasing coverage and penetration of services for these children, challenges to achieving penetration of services in high-risk families, opportunities created by bundling multiple services within one provider, potential leveraging of paraprofessionals to deliver care, and mobilizing communities to support children in households at high risk for negative outcomes. We end with a number of suggestions for how to ensure the equitable scale-up of integrated ECD and nutrition services that take into account current global priorities, as well as coverage and penetration of services. © 2013 New York Academy of Sciences.
Leveraging the national cyberinfrastructure for biomedical research
LeDuc, Richard; Vaughn, Matthew; Fonner, John M; Sullivan, Michael; Williams, James G; Blood, Philip D; Taylor, James; Barnett, William
2014-01-01
In the USA, the national cyberinfrastructure refers to a system of research supercomputer and other IT facilities and the high speed networks that connect them. These resources have been heavily leveraged by scientists in disciplines such as high energy physics, astronomy, and climatology, but until recently they have been little used by biomedical researchers. We suggest that many of the ‘Big Data’ challenges facing the medical informatics community can be efficiently handled using national-scale cyberinfrastructure. Resources such as the Extreme Science and Discovery Environment, the Open Science Grid, and Internet2 provide economical and proven infrastructures for Big Data challenges, but these resources can be difficult to approach. Specialized web portals, support centers, and virtual organizations can be constructed on these resources to meet defined computational challenges, specifically for genomics. We provide examples of how this has been done in basic biology as an illustration for the biomedical informatics community. PMID:23964072
Leveraging Large-Scale Semantic Networks for Adaptive Robot Task Learning and Execution.
Boteanu, Adrian; St Clair, Aaron; Mohseni-Kabir, Anahita; Saldanha, Carl; Chernova, Sonia
2016-12-01
This work seeks to leverage semantic networks containing millions of entries encoding assertions of commonsense knowledge to enable improvements in robot task execution and learning. The specific application we explore in this project is object substitution in the context of task adaptation. Humans easily adapt their plans to compensate for missing items in day-to-day tasks, substituting a wrap for bread when making a sandwich, or stirring pasta with a fork when out of spoons. Robot plan execution, however, is far less robust, with missing objects typically leading to failure if the robot is not aware of alternatives. In this article, we contribute a context-aware algorithm that leverages the linguistic information embedded in the task description to identify candidate substitution objects without reliance on explicit object affordance information. Specifically, we show that the task context provided by the task labels within the action structure of a task plan can be leveraged to disambiguate information within a noisy large-scale semantic network containing hundreds of potential object candidates to identify successful object substitutions with high accuracy. We present two extensive evaluations of our work on both abstract and real-world robot tasks, showing that the substitutions made by our system are valid, accepted by users, and lead to a statistically significant reduction in robot learning time. In addition, we report the outcomes of testing our approach with a large number of crowd workers interacting with a robot in real time.
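A minimal sketch of context-aware object substitution in the spirit described above: candidates are scored by their semantic relatedness to both the missing object and the task-context labels. The relatedness table, scoring rule, and example concepts are invented for illustration and stand in for the large-scale semantic network used in the article.

```python
# Toy stand-in for a large-scale semantic network: pairwise relatedness scores
# (the article queries a network with millions of assertions; values invented).
relatedness = {
    ("bread", "wrap"): 0.8, ("bread", "tortilla"): 0.7, ("bread", "plate"): 0.2,
    ("sandwich", "wrap"): 0.6, ("sandwich", "tortilla"): 0.5, ("sandwich", "plate"): 0.3,
}

def rel(a, b):
    return relatedness.get((a, b), relatedness.get((b, a), 0.0))

def rank_substitutes(missing, task_context, candidates):
    """Score candidates by relatedness to the missing object, disambiguated by
    the task labels taken from the plan (context-aware scoring sketch)."""
    scored = []
    for c in candidates:
        context_score = sum(rel(word, c) for word in task_context) / len(task_context)
        scored.append((rel(missing, c) * context_score, c))
    return sorted(scored, reverse=True)

print(rank_substitutes("bread", ["sandwich"], ["wrap", "tortilla", "plate"]))
# "wrap" ranks first; "plate" ranks last despite being a common kitchen object.
```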
An Accurate Direction Finding Scheme Using Virtual Antenna Array via Smartphones.
Wang, Xiaopu; Xiong, Yan; Huang, Wenchao
2016-10-29
With the development of localization technologies, researchers solve indoor localization problems using diverse methods and equipment. Most localization techniques require either specialized devices or fingerprints, which are inconvenient for daily use. Therefore, we propose and implement an accurate, efficient and lightweight system for indoor direction finding using common smartphones and loudspeakers. Our method is derived from a key insight: By moving a smartphone in regular patterns, we can effectively emulate the sensitivity and functionality of a Uniform Antenna Array to estimate the angle of arrival of the target signal. Specifically, a user only needs to hold his smartphone still in front of him, and then rotate his body through 360° with the smartphone at an approximately constant velocity. Then, our system can provide accurate directional guidance and lead the user to their destinations (ordinary loudspeakers preset in the indoor environment that transmit high-frequency acoustic signals) after a few measurements. Major challenges in implementing our system are not only emulating a virtual antenna array with ordinary smartphones but also overcoming the detection difficulties caused by the complex indoor environment. In addition, we leverage the gyroscope of the smartphone to reduce the impact of a user's motion pattern change on the accuracy of our system. To mitigate the multipath effect, we leverage multiple signal classification to calculate the direction of the target signal, and then design and deploy our system in various indoor scenes. Extensive comparative experiments show that our system is reliable under various circumstances.
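The final angle-of-arrival step relies on multiple signal classification (MUSIC). Below is a minimal numpy sketch of MUSIC on a synthetic uniform linear array; the element count, spacing, noise level, and single-source assumption are illustrative and do not reproduce the smartphone virtual-array parameters.

```python
import numpy as np

# Synthetic uniform linear array: M elements, half-wavelength spacing.
M, d, wavelength = 8, 0.5, 1.0
true_angle_deg, snapshots = 25.0, 200

def steering(theta_deg):
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d / wavelength * np.arange(M) * np.sin(theta))

# Simulate narrowband snapshots of one source plus noise.
rng = np.random.default_rng(1)
s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
noise = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = np.outer(steering(true_angle_deg), s) + noise

# MUSIC: covariance, eigendecomposition, noise subspace, pseudo-spectrum scan.
R = X @ X.conj().T / snapshots
eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order
En = eigvecs[:, :-1]                       # noise subspace (one source assumed)

angles = np.arange(-90.0, 90.5, 0.5)
spectrum = []
for a in angles:
    v = steering(a)
    spectrum.append(1.0 / np.abs(v.conj() @ En @ En.conj().T @ v))
print("estimated AoA:", angles[int(np.argmax(spectrum))], "degrees")
```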
Problem decomposition by mutual information and force-based clustering
NASA Astrophysics Data System (ADS)
Otero, Richard Edward
The scale of engineering problems has sharply increased over the last twenty years. Larger coupled systems, increasing complexity, and limited resources create a need for methods that automatically decompose problems into manageable sub-problems by discovering and leveraging problem structure. The ability to learn the coupling (inter-dependence) structure and reorganize the original problem could lead to large reductions in the time to analyze complex problems. Such decomposition methods could also provide engineering insight into the fundamental physics driving problem solution. This work advances the state of the art in engineering decomposition through the application of techniques originally developed within computer science and information theory. The work describes the current state of automatic problem decomposition in engineering and utilizes several promising ideas to advance the state of the practice. Mutual information is a novel metric for data dependence and works on both continuous and discrete data. Mutual information can measure both the linear and non-linear dependence between variables without the limitations of linear dependence measured through covariance. Mutual information is also able to handle data that does not have derivative information, unlike other metrics that require it. The value of mutual information to engineering design work is demonstrated on a planetary entry problem. This study utilizes a novel tool developed in this work for planetary entry system synthesis. A graphical method, force-based clustering, is used to discover related sub-graph structure as a function of problem structure and links ranked by their mutual information. This method does not require the stochastic use of neural networks and could be used with any link ranking method currently utilized in the field. Application of this method is demonstrated on a large, coupled low-thrust trajectory problem. Mutual information also serves as the basis for an alternative global optimizer, called MIMIC, which is unrelated to Genetic Algorithms. As a further advance on current practice, this work demonstrates the use of MIMIC as a global method that explicitly models problem structure with mutual information, providing an alternate method for globally searching multi-modal domains. By leveraging discovered problem inter-dependencies, MIMIC may be appropriate for highly coupled problems or those with large function evaluation cost. This work introduces a useful addition to the MIMIC algorithm that enables its use on continuous input variables. By leveraging automatic decision tree generation methods from Machine Learning and a set of randomly generated test problems, decision trees for which method to apply are also created, quantifying decomposition performance over a large region of the design space.
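Mutual information between two variables can be estimated from a joint histogram of samples, which is what makes it usable on data without derivative information. A small sketch follows; the binning choice and sample data are illustrative, not the entry-problem variables used in the work above.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Estimate I(X;Y) in bits from paired samples via a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_linear = 2 * x + 0.5 * rng.normal(size=5000)          # linear coupling
y_nonlinear = np.abs(x) + 0.5 * rng.normal(size=5000)   # non-linear coupling

# Covariance misses the non-linear dependence; mutual information does not.
print(mutual_information(x, y_linear), mutual_information(x, y_nonlinear))
print(np.corrcoef(x, y_nonlinear)[0, 1])   # near zero despite real dependence
```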
DOE Office of Scientific and Technical Information (OSTI.GOV)
CHEN, JOANNA; SIMIRENKO, LISA; TAPASWI, MANJIRI
The DIVA software provides an interface to a process in which researchers design their DNA with a web-based graphical user interface, submit their designs to a central queue, and a few weeks later receive their sequence-verified clonal constructs. Each researcher independently designs the DNA to be constructed with a web-based BioCAD tool, and presses a button to submit their designs to a central queue. Researchers have web-based access to their DNA design queues, and can track the progress of their submitted designs as they progress from "evaluation", to "waiting for reagents", to "in progress", to "complete". Researchers access their completed constructs through the central DNA repository. Along the way, all DNA construction success/failure rates are captured in a central database. Once a design has been submitted to the queue, a small number of dedicated staff evaluate the design for feasibility and provide feedback to the responsible researcher if the design is either unreasonable (e.g., encompasses a combinatorial library of a billion constructs) or small design changes could significantly facilitate the downstream implementation process. The dedicated staff then use DNA assembly design automation software to optimize the DNA construction process for the design, leveraging existing parts from the DNA repository where possible and ordering synthetic DNA where necessary. SynTrack software manages the physical locations and availability of the various requisite reagents and process inputs (e.g., DNA templates). Once all requisite process inputs are available, the design progresses from "waiting for reagents" to "in progress" in the design queue. Human-readable and machine-parseable DNA construction protocols output by the DNA assembly design automation software are then executed by the dedicated staff exploiting lab automation devices wherever possible. Since all employed DNA construction methods are sequence-agnostic and standardized (they use the same enzymatic master mixes and reaction conditions), completely independent DNA construction tasks can be aggregated into the same multi-well plates and pursued in parallel. The resulting sets of cloned constructs can then be screened by high-throughput next-gen sequencing platforms for sequence correctness. A combination of long read-length (e.g., PacBio) and paired-end read platforms (e.g., Illumina) would be exploited depending on the particular task at hand (e.g., PacBio might be sufficient to screen a set of pooled constructs with significant gene divergence). Post sequence verification, designs for which at least one correct clone was identified will progress to a "complete" status, while designs for which no correct clones were identified will progress to a "failure" status. Depending on the failure mode (e.g., no transformants), and how many prior attempts/variations of assembly protocol have already been made for a given design, subsequent attempts may be made or the design can progress to a "permanent failure" state. All success and failure rate information will be captured during the process, including at which stage a given clonal construction procedure failed (e.g., no PCR product) and what the exact failure was (e.g., assembly piece 2 missing). This success/failure rate data can be leveraged to refine the DNA assembly design process.
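The queue statuses and transitions described above can be summarized as a small state machine. The sketch below is an illustrative reading of that workflow, not the actual DIVA or SynTrack data model.

```python
# Illustrative state machine for a submitted DNA design (reading of the text
# above; not the actual DIVA/SynTrack schema).
TRANSITIONS = {
    "evaluation": {"waiting for reagents"},
    "waiting for reagents": {"in progress"},
    "in progress": {"complete", "failure"},
    "failure": {"in progress", "permanent failure"},  # retry a varied protocol, or give up
}

def advance(status, next_status):
    if next_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition: {status!r} -> {next_status!r}")
    return next_status

status = "evaluation"
for step in ["waiting for reagents", "in progress", "complete"]:
    status = advance(status, step)
print(status)   # complete
```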
Makification: Towards a Framework for Leveraging the Maker Movement in Formal Education
ERIC Educational Resources Information Center
Cohen, Jonathan; Jones, W. Monty; Smith, Shaunna; Calandra, Brendan
2017-01-01
Maker culture is part of a burgeoning movement in which individuals leverage modern digital technologies to produce and share physical artifacts with a broader community. Certain components of the maker movement, if properly leveraged, hold promise for transforming formal education in a variety of contexts. The authors here work towards a…
13 CFR 107.1120 - General eligibility requirements for Leverage.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Enterprises in § 107.710(b). (h) Show, to the satisfaction of SBA, that your management is qualified and has... Information About Obtaining Leverage § 107.1120 General eligibility requirements for Leverage. To be eligible..., 1996 will be provided to Smaller Enterprises (as defined in § 107.710(a)); and (ii) You must...
13 CFR 107.1120 - General eligibility requirements for Leverage.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Enterprises in § 107.710(b). (h) Show, to the satisfaction of SBA, that your management is qualified and has... Information About Obtaining Leverage § 107.1120 General eligibility requirements for Leverage. To be eligible..., 1996 will be provided to Smaller Enterprises (as defined in § 107.710(a)); and (ii) You must...
13 CFR 107.1120 - General eligibility requirements for Leverage.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Enterprises in § 107.710(b). (h) Show, to the satisfaction of SBA, that your management is qualified and has... Information About Obtaining Leverage § 107.1120 General eligibility requirements for Leverage. To be eligible..., 1996 will be provided to Smaller Enterprises (as defined in § 107.710(a)); and (ii) You must...
Virtual reality 3D headset based on DMD light modulators
NASA Astrophysics Data System (ADS)
Bernacki, Bruce E.; Evans, Allan; Tang, Edward
2014-06-01
We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMD). Current methods for presenting information for virtual reality are focused on either polarization-based modulators such as liquid crystal on silicon (LCoS) devices, or miniature LCD or LED displays often using lenses to place the image at infinity. LCoS modulators are an area of active research and development, and reduce the amount of viewing light by 50% due to the use of polarization. Viewable LCD or LED screens may suffer from low resolution, cause eye fatigue, and exhibit a "screen door" or pixelation effect due to the low pixel fill factor. Our approach leverages a mature technology based on silicon micromirrors delivering 720p resolution displays in a small form-factor with high fill factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high-definition resolution and low power consumption, and many of the design methods developed for DMD projector applications can be adapted to display use. Potential applications include night driving with natural depth perception, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. Our design concept is described in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina, resulting in a virtual retinal display.
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
Military needs and forecast, 2
NASA Technical Reports Server (NTRS)
Goldstayn, Alan B.
1986-01-01
FORECAST 2 has accomplished its objectives of identifying high leverage technologies for corporate Air Force review. Implementation is underway with emphasis on restructuring existing programs and programming resources in the FY88 BES/FY89 POM. Many joint service/agency opportunities exist.
Agent-Based Model with Asymmetric Trading and Herding for Complex Financial Systems
Chen, Jun-Jie; Zheng, Bo; Tan, Lei
2013-01-01
Background: For complex financial systems, the negative and positive return-volatility correlations, i.e., the so-called leverage and anti-leverage effects, are particularly important for the understanding of the price dynamics. However, the microscopic origination of the leverage and anti-leverage effects is still not understood, and how to produce these effects in agent-based modeling remains open. On the other hand, in constructing microscopic models, it is a promising conception to determine model parameters from empirical data rather than from statistical fitting of the results. Methods: To study the microscopic origination of the return-volatility correlation in financial systems, we take into account the individual and collective behaviors of investors in real markets, and construct an agent-based model. The agents are linked with each other and trade in groups, and particularly, two novel microscopic mechanisms, i.e., investors’ asymmetric trading and herding in bull and bear markets, are introduced. Further, we propose effective methods to determine the key parameters in our model from historical market data. Results: With the model parameters determined for six representative stock-market indices in the world, respectively, we obtain the corresponding leverage or anti-leverage effect from the simulation, and the effect is in agreement with the empirical one on amplitude and duration. At the same time, our model produces other features of the real markets, such as the fat-tail distribution of returns and the long-term correlation of volatilities. Conclusions: We reveal that for the leverage and anti-leverage effects, both the investors’ asymmetric trading and herding are essential generation mechanisms. Among the six markets, however, the investors’ trading is approximately symmetric for the five markets which exhibit the leverage effect, thus contributing very little. These two microscopic mechanisms and the methods for the determination of the key parameters can be applied to other complex systems with similar asymmetries. PMID:24278146
Agent-based model with asymmetric trading and herding for complex financial systems.
Chen, Jun-Jie; Zheng, Bo; Tan, Lei
2013-01-01
For complex financial systems, the negative and positive return-volatility correlations, i.e., the so-called leverage and anti-leverage effects, are particularly important for the understanding of the price dynamics. However, the microscopic origination of the leverage and anti-leverage effects is still not understood, and how to produce these effects in agent-based modeling remains open. On the other hand, in constructing microscopic models, it is a promising conception to determine model parameters from empirical data rather than from statistical fitting of the results. To study the microscopic origination of the return-volatility correlation in financial systems, we take into account the individual and collective behaviors of investors in real markets, and construct an agent-based model. The agents are linked with each other and trade in groups, and particularly, two novel microscopic mechanisms, i.e., investors' asymmetric trading and herding in bull and bear markets, are introduced. Further, we propose effective methods to determine the key parameters in our model from historical market data. With the model parameters determined for six representative stock-market indices in the world, respectively, we obtain the corresponding leverage or anti-leverage effect from the simulation, and the effect is in agreement with the empirical one on amplitude and duration. At the same time, our model produces other features of the real markets, such as the fat-tail distribution of returns and the long-term correlation of volatilities. We reveal that for the leverage and anti-leverage effects, both the investors' asymmetric trading and herding are essential generation mechanisms. Among the six markets, however, the investors' trading is approximately symmetric for the five markets which exhibit the leverage effect, thus contributing very little. These two microscopic mechanisms and the methods for the determination of the key parameters can be applied to other complex systems with similar asymmetries.
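The return-volatility correlation studied in the two entries above can be measured directly from a return series with the standard leverage correlation function L(τ) = ⟨r(t)·|r(t+τ)|²⟩ / ⟨r(t)²⟩². The sketch below computes it on synthetic returns with a built-in leverage effect; the volatility dynamics are a crude illustrative stand-in, not the agent-based model from the papers.

```python
import numpy as np

def leverage_correlation(returns, max_lag=20):
    """L(tau) = <r(t) * |r(t+tau)|^2> / <r(t)^2>^2 for tau = 1..max_lag.
    Negative values at positive lags indicate the leverage effect."""
    r = np.asarray(returns, dtype=float)
    norm = np.mean(r ** 2) ** 2
    return [np.mean(r[:-tau] * np.abs(r[tau:]) ** 2) / norm
            for tau in range(1, max_lag + 1)]

# Synthetic returns whose volatility rises after negative returns.
rng = np.random.default_rng(0)
n = 20000
r = np.zeros(n)
vol = 0.01
for t in range(1, n):
    vol = 0.01 + 0.9 * (vol - 0.01) - 0.05 * min(r[t - 1], 0.0)
    r[t] = vol * rng.normal()

L = leverage_correlation(r, max_lag=5)
print(["%.3f" % v for v in L])   # negative values at small lags
```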