Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang
2007-01-01
The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
Meeting the Needs of Children with Disabilities
ERIC Educational Resources Information Center
Aron, Laudan Y.; Loprest, Pamela J.
2007-01-01
Seldom do the needs of children with disabilities divide neatly along program lines. Instead, children and their families navigate a large, complex, and fragmented array of programs with inconsistent eligibility standards, application procedures, and program goals. "Meeting the Needs of Children with Disabilities" examines these programs, focusing…
Toren, Katelynne Gardner; Elsenboss, Carina; Narita, Masahiro
2017-01-01
Public Health—Seattle and King County, a metropolitan health department in western Washington, experiences rates of tuberculosis (TB) that are 1.6 times higher than are state and national averages. The department’s TB Control Program uses public health emergency management tools and capabilities sustained with Centers for Disease Control and Prevention grant funding to manage large-scale complex case investigations. We have described 3 contact investigations in large congregate settings that the TB Control Program conducted in 2015 and 2016. The program managed the investigations using public health emergency management tools, with support from the Preparedness Program. The 3 investigations encompassed medical evaluation of more than 1600 people, used more than 100 workers, identified nearly 30 individuals with latent TB infection, and prevented an estimated 3 cases of active disease. These incidents exemplify how investments in public health emergency preparedness can enhance health outcomes in traditional areas of public health. PMID:28892445
IP-Based Video Modem Extender Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierson, L G; Boorman, T M; Howe, R E
2003-12-16
Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance-limited modems and RGB switches that simply do not scale to hundreds of users across the local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide Area application over the DOE Complex is infeasible using these limited-distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable, well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.
NASA Astrophysics Data System (ADS)
Kashid, Satishkumar S.; Maity, Rajib
2012-08-01
Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists due to inherent complexities in the climatic system. Large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established influences on Indian Summer Monsoon Rainfall. Information from these two large-scale circulation patterns, in terms of their indices, is used to model the complex relationship between Indian Summer Monsoon Rainfall and the ENSO and EQUINOO indices. However, extracting the signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions have been made for 'All India' as one unit, as well as for the five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. The 'Artificial Intelligence' tool Genetic Programming (GP) has been employed to model this problem. The Genetic Programming approach is found to capture the complex relationship between monthly Indian Summer Monsoon Rainfall and the large-scale atmospheric circulation pattern indices ENSO and EQUINOO. The findings of this study indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information predict All India Summer Monsoon Rainfall with a correlation coefficient as high as 0.866, which is attractive for such a complex system. A separate analysis, performed at the end of May, is carried out for All India Summer Monsoon Rainfall as one unit and for the five homogeneous monsoon regions, based on the ENSO and EQUINOO indices of March, April and May only. In this case, All India Summer Monsoon Rainfall could be predicted with a correlation coefficient of 0.70, with somewhat lower correlation coefficient (C.C.) values for the different 'homogeneous monsoon regions'.
Science information systems: Visualization
NASA Technical Reports Server (NTRS)
Wall, Ray J.
1991-01-01
Future programs in earth science, planetary science, and astrophysics will involve complex instruments that produce data at unprecedented rates and volumes. Current methods for data display, exploration, and discovery are inadequate. Visualization technology offers a means for the user to comprehend, explore, and examine complex data sets. The goal of this program is to increase the effectiveness and efficiency of scientists in extracting scientific information from large volumes of instrument data.
Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing
Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong
2014-01-01
This paper presents further research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier to develop, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform when it needs to process big-data-based scientific applications. PMID:24574931
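As an illustration of the block decomposition that underlies the block-based Gauss-Jordan algorithm mentioned above, the following Python/NumPy sketch performs one block elimination step via the Schur complement; it only shows the block algebra, not the paper's middleware or grid scheduling, and the test matrix is hypothetical data.

    import numpy as np

    def block_gauss_jordan_inverse(M, b):
        """Invert M via one block elimination step with leading block size b."""
        A, B = M[:b, :b], M[:b, b:]
        C, D = M[b:, :b], M[b:, b:]
        Ainv = np.linalg.inv(A)                 # pivot block inverse
        S = D - C @ Ainv @ B                    # Schur complement of the pivot block
        Sinv = np.linalg.inv(S)
        top = np.hstack([Ainv + Ainv @ B @ Sinv @ C @ Ainv, -Ainv @ B @ Sinv])
        bottom = np.hstack([-Sinv @ C @ Ainv, Sinv])
        return np.vstack([top, bottom])

    # Each block product or block inverse above is the kind of coarse-grained task
    # that a grid middleware could schedule on a remote node.
    rng = np.random.default_rng(1)
    M = rng.standard_normal((6, 6)) + 6.0 * np.eye(6)   # well-conditioned test matrix
    print(np.allclose(block_gauss_jordan_inverse(M, 3), np.linalg.inv(M)))  # True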
Automated sizing of large structures by mixed optimization methods
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.; Loendorf, D.
1973-01-01
A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details, accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
Some thoughts on the management of large, complex international space ventures
NASA Technical Reports Server (NTRS)
Lee, T. J.; Kutzer, Ants; Schneider, W. C.
1992-01-01
Management issues relevant to the development and deployment of large international space ventures are discussed, with particular attention given to previous experience. Management approaches utilized in the past are labeled as either simple or complex, and signs of efficient management are examined. Simple approaches include those in which experiments and subsystems are developed for integration into spacecraft, and the Apollo-Soyuz Test Project is given as an example of a simple multinational approach. Complex approaches include those for ESA's Spacelab Project and the Space Station Freedom, in which functional interfaces cross agency and political boundaries. It is concluded that individual elements of space programs should be managed by the individual participating agencies, with overall configuration control coordinated by level and a program director managing overall objectives and project interfaces.
ERIC Educational Resources Information Center
Grosch, Audrey N.
1973-01-01
A regionally organized program for serials bibliography is proposed because of the large volume of complex data needing control and the many purposes to which the data can be put in support of regional or local needs. (2 references) (Author)
The Design, Development and Testing of a Multi-process Real-time Software System
2007-03-01
programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another
Environmental projects. Volume 1: Polychlorinated biphenyl (PCB) abatement program
NASA Technical Reports Server (NTRS)
Kushner, L.
1987-01-01
Six large parabolic dish antennas are located at the Goldstone Deep Space Communications Complex north of Barstow, California. Some of the ancillary electrical equipment of these Deep Space Stations, particularly transformers and power capacitors, was filled with stable, fire-retardant, dielectric fluids containing substances called polychlorinated biphenyls (PCBs). Because the Environmental Protection Agency has determined that PCBs are environmental pollutants toxic to humans, all NASA centers have been asked to participate in a PCB-abatement program. Under the supervision of JPL's Office of Telecommunications and Data Acquisition, a two-year-long PCB-abatement program has eliminated PCBs from the Goldstone Complex.
ERIC Educational Resources Information Center
McNamara, K. P.; O'Reilly, S. L.; George, J.; Peterson, G. M.; Jackson, S. L.; Duncan, G.; Howarth, H.; Dunbar, J. A.
2015-01-01
Background: Delivery of cardiovascular disease (CVD) prevention programs by community pharmacists appears effective and enhances health service access. However, their capacity to implement complex behavioural change processes during patient counselling remains largely unexplored. This study aims to determine intervention fidelity by pharmacists…
Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...
Organizational Structures that Support Internal Program Evaluation
ERIC Educational Resources Information Center
Lambur, Michael T.
2008-01-01
This chapter explores how the structure of large complex organizations such as Cooperative Extension affects their ability to support internal evaluation of their programs and activities. Following a literature review of organizational structure and its relation to internal evaluation capacity, the chapter presents the results of interviews with…
Costing Complex Products, Operations, and Support
2011-04-30
Symposium, 10-12 May 2011, Seaside, CA. Complex products and systems (CoPS), such as large defense...Program Executive Officer SHIPS • Commander, Naval Sea Systems Command • Army Contracting Command, U.S. Army Materiel Command • Program Manager...Airborne, Maritime and Fixed Station Joint Tactical Radio System
An implementation of the distributed programming structural synthesis system (PROSSS)
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1981-01-01
A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.
ERIC Educational Resources Information Center
Silva-Maceda, Gabriela; Arjona-Villicaña, P. David; Castillo-Barrera, F. Edgar
2016-01-01
Learning to program is a complex task, and the impact of different pedagogical approaches to teach this skill has been hard to measure. This study examined the performance data of seven cohorts of students (N = 1168) learning programming under three different pedagogical approaches. These pedagogical approaches varied either in the length of the…
Factors Associated with Attrition in Weight Loss Programs
ERIC Educational Resources Information Center
Grave, Riccardo Dalle; Suppini, Alessandro; Calugi, Simona; Marchesini, Giulio
2006-01-01
Attrition in weight loss programs is a complex process, influenced by patients' pretreatment characteristics and treatment variables, but available data are contradictory. Only a few variables have been confirmed by more than one study as relevant risk factors, but recently new data of clinical utility emerged from "real world" large observational…
Schwerner, Henry; Mellody, Timothy; Goldstein, Allan B; Wansink, Daryl; Sullivan, Virginia; Yelenik, Stephan N; Charlton, Warwick; Lloyd, Kelley; Courtemanche, Ted
2006-02-01
The objective of this study was to observe trends in payer expenditures for plan members with one of 14 chronic, complex conditions comparing one group with a disease management program specific to their condition (the intervention group) and the other with no specific disease management program (the control group) for these conditions. The authors used payer claims and membership data to identify members eligible for the program in a 12-month baseline year (October 2001 to September 2002) and a subsequent 12-month program year (October 2002 to September 2003). Two payers were analyzed: one health plan with members primarily in New Jersey (AmeriHealth New Jersey [AHNJ]), where the disease management program was offered, and one affiliated large plan with members primarily in the metro Philadelphia area, where the program was not offered. The claims payment policy for both plans is identical. Intervention and control groups were analyzed for equivalence. The analysis was conducted in both groups over identical time periods. The intervention group showed statistically significant (p < 0.01) differences in total paid claims trend and expenditures when compared to the control group. Intervention group members showed a reduction in expenditures of -8%, while control group members showed an increase of +10% over identical time periods. Subsequent analyses controlling for outliers and product lines served to confirm the overall results. The disease management program is likely responsible for the observed difference between the intervention and control group results. A well-designed, targeted disease management program offered by a motivated, supportive health plan can play an important role in cost improvement strategies for members with complex, chronic conditions.
Nyström, Monica Elisabeth; Strehlenert, Helena; Hansson, Johan; Hasson, Henna
2014-09-18
Large-scale change initiatives stimulating change in several organizational systems in the health and social care sector are challenging both to lead and evaluate. There is a lack of systematic research that can enrich our understanding of strategies to facilitate large system transformations in this sector. The purpose of this study was to examine the characteristics of core activities and strategies to facilitate implementation and change of a national program aimed at improving life for the most ill elderly people in Sweden. The program outcomes were also addressed to assess the impact of these strategies. A longitudinal case study design with multiple data collection methods was applied. Archival data (n = 795), interviews with key stakeholders (n = 11) and non-participant observations (n = 23) were analysed using content analysis. Outcome data were obtained from national quality registries. This study presents an approach for implementing a large national change program that is characterized by initial flexibility and dynamism regarding content and facilitation strategies and a growing complexity over time requiring more structure and coordination. The description of activities and strategies shows that the program management team engaged a variety of stakeholders and actor groups and accordingly used a palette of different strategies. The main strategies used to influence change in the target organisations were to use regional improvement coaches, regional strategic management teams, national quality registries, financial incentives and annually revised agreements. Interactive learning sessions, intense communication, monitoring and measurement, and active involvement of different experts and stakeholders, including elderly people, complemented these strategies. Program outcomes showed steady progress in most of the five target areas, less so for the target of achieving coordinated care. There is no blueprint for how to approach the challenging task of leading large-scale change programs in complex contexts, but our conclusion is that more attention has to be given to the multidimensional strategies that program management needs to consider. This multidimensionality comprises different strategies depending on types of actors, system levels, contextual factors, program progress over time, program content, types of learning and change processes, and the conditions for sustainability.
The Biophysics Microgravity Initiative
NASA Technical Reports Server (NTRS)
Gorti, S.
2016-01-01
Biophysical microgravity research on the International Space Station using biological materials has been ongoing for several decades. The well-documented substantive effects of long-duration microgravity include the facilitation of the assembly of biological macromolecules into large structures, e.g., the formation of large protein crystals under microgravity. NASA is invested not only in understanding the possible physical mechanisms of crystal growth, but also in promoting two flight investigations to determine the influence of µ-gravity on protein crystal quality. In addition to crystal growth, flight investigations to determine the effects of shear on nucleation and subsequent formation of complex structures (e.g., crystals, fibrils, etc.) are also supported. It is now considered that long-duration microgravity research aboard the ISS could also make possible the formation of large complex biological and biomimetic materials. Investigations of various materials undergoing complex structure formation in microgravity will not only strengthen NASA science programs, but may also provide invaluable insight towards the construction of large complex tissues, organs, or biomimetic materials on Earth.
Students' Explanations in Complex Learning of Disciplinary Programming
ERIC Educational Resources Information Center
Vieira, Camilo
2016-01-01
Computational Science and Engineering (CSE) has been denominated as the third pillar of science and as a set of important skills to solve the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or…
Can Models Capture the Complexity of the Systems Engineering Process?
NASA Astrophysics Data System (ADS)
Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.
Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003 and Cullen, 2004), and many others have been late, well over budget, or have failed: Hilton/Marriott/American Airlines system for hotel reservations and flights; 1988-1992; 125 million; "scrapped"
Youth and the Workplace: Second-Chance Programs and the Hard-to-Serve.
ERIC Educational Resources Information Center
Smith, Thomas J.; And Others
The task of addressing the complex and deeply rooted problems faced by the nation's at-risk youth is one that largely falls outside the scope of traditional institutions. Investment in the development and operation of "second-chance" education and employment programs has historically been inadequate, haphazard, and uncertain. The gains…
Users manual for program NYQUIST: Liquid rocket nyquist plots developed for use on a PC computer
NASA Astrophysics Data System (ADS)
Armstrong, Wilbur C.
1992-06-01
The piping in a liquid rocket can assume complex configurations due to multiple tanks, multiple engines, and structures that must be piped around. The capability to handle some of these complex configurations has been incorporated into the NYQUIST code. The capability to modify the input on line has been implemented. The configurations allowed include multiple tanks, multiple engines, and the splitting of a pipe into unequal segments going to different (or the same) engines. This program will handle the following element types: straight pipes, bends, inline accumulators, tuned stub accumulators, Helmholtz resonators, parallel resonators, pumps, split pipes, multiple tanks, and multiple engines. The code is too large to compile as one program using Microsoft FORTRAN 5; therefore, the code was broken into two segments: NYQUIST1.FOR and NYQUIST2.FOR. These are compiled separately and then linked together. The final run code is not too large (approximately 344,000 bytes).
Users manual for program NYQUIST: Liquid rocket nyquist plots developed for use on a PC computer
NASA Technical Reports Server (NTRS)
Armstrong, Wilbur C.
1992-01-01
The piping in a liquid rocket can assume complex configurations due to multiple tanks, multiple engines, and structures that must be piped around. The capability to handle some of these complex configurations has been incorporated into the NYQUIST code. The capability to modify the input on line has been implemented. The configurations allowed include multiple tanks, multiple engines, and the splitting of a pipe into unequal segments going to different (or the same) engines. This program will handle the following element types: straight pipes, bends, inline accumulators, tuned stub accumulators, Helmholtz resonators, parallel resonators, pumps, split pipes, multiple tanks, and multiple engines. The code is too large to compile as one program using Microsoft FORTRAN 5; therefore, the code was broken into two segments: NYQUIST1.FOR and NYQUIST2.FOR. These are compiled separately and then linked together. The final run code is not too large (approximately 344,000 bytes).
Relative Sizes of Organic Molecules
NASA Technical Reports Server (NTRS)
2000-01-01
This computer graphic depicts the relative complexity of crystallizing large proteins in order to study their structures through x-ray crystallography. Insulin is a vital protein whose structure has several subtle points that scientists are still trying to determine. Large molecules such as insulin are complex, with structures that are comparatively difficult to understand. For comparison, a sugar molecule (which many people have grown as hard crystals in science class) and a water molecule are shown. These images were produced with the Macmolecule program. Photo credit: NASA/Marshall Space Flight Center (MSFC)
Chen, Xuehui; Sun, Yunxiang; An, Xiongbo; Ming, Dengming
2011-10-14
Normal mode analysis of large biomolecular complexes at atomic resolution remains challenging in computational structural biology due to the large amounts of memory space and central processing unit time required. In this paper, we present a method called the virtual interface substructure synthesis method, or VISSM, to calculate approximate normal modes of large biomolecular complexes at atomic resolution. VISSM introduces the subunit interfaces as independent substructures that join contacting molecules so as to keep the integrity of the system. Compared with other approximate methods, VISSM delivers atomic modes with no need of a coarse-graining-then-projection procedure. The method was examined for 54 protein complexes against conventional all-atom normal mode analysis using the CHARMM simulation program, and the overlap of the first 100 low-frequency modes is greater than 0.7 for 49 complexes, indicating its accuracy and reliability. We then applied VISSM to the satellite panicum mosaic virus (SPMV, 78,300 atoms) and to F-actin filament structures of up to 39-mer (228,813 atoms) and found that VISSM calculations capture functionally important conformational changes accessible to these structures at atomic resolution. Our results support the idea that the dynamics of a large biomolecular complex might be understood based on the motions of its component subunits and the way in which subunits bind one another. © 2011 American Institute of Physics
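A minimal NumPy sketch of the overlap measure referred to above, under assumed inputs (a symmetric Hessian and atomic masses generated at random rather than taken from CHARMM or VISSM): mass-weighted normal modes are obtained by diagonalization and two mode sets are compared through their absolute overlap matrix.

    import numpy as np

    def normal_modes(hessian, masses):
        """Diagonalize the mass-weighted Hessian; columns of the result are modes."""
        m = np.repeat(masses, 3)                      # one mass per Cartesian coordinate
        inv_sqrt_m = 1.0 / np.sqrt(m)
        mw_hessian = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)
        eigvals, eigvecs = np.linalg.eigh(mw_hessian) # ascending eigenvalues
        return eigvals, eigvecs

    def mode_overlap(modes_a, modes_b, n_modes=100):
        """Absolute overlap matrix between the first n_modes of two mode sets."""
        A = modes_a[:, :n_modes]
        B = modes_b[:, :n_modes]
        return np.abs(A.T @ B)                        # entry (i, j) = |<a_i, b_j>|

    # Toy usage with a random symmetric "Hessian" just to exercise the functions;
    # the self-overlap of a mode set is the identity matrix up to round-off.
    rng = np.random.default_rng(0)
    n_atoms = 50
    H = rng.standard_normal((3 * n_atoms, 3 * n_atoms))
    H = H + H.T
    masses = rng.uniform(1.0, 16.0, size=n_atoms)
    _, modes = normal_modes(H, masses)
    print(mode_overlap(modes, modes, n_modes=10).diagonal())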
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
ERIC Educational Resources Information Center
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
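As a usage illustration only (the model, data values, and file name below are the standard eight-schools example, not material from the article), a Stan model can be compiled and sampled from Python with the cmdstanpy interface:

    from cmdstanpy import CmdStanModel

    stan_code = """
    data {
      int<lower=0> J;
      vector[J] y;
      vector<lower=0>[J] sigma;
    }
    parameters {
      real mu;
      real<lower=0> tau;
      vector[J] theta;
    }
    model {
      theta ~ normal(mu, tau);
      y ~ normal(theta, sigma);
    }
    """
    with open("eight_schools.stan", "w") as f:
        f.write(stan_code)

    data = {"J": 8,
            "y": [28, 8, -3, 7, -1, 1, 18, 12],
            "sigma": [15, 10, 16, 11, 9, 11, 10, 18]}

    model = CmdStanModel(stan_file="eight_schools.stan")  # compiles the model
    fit = model.sample(data=data, chains=4)               # runs Stan's HMC/NUTS sampler
    print(fit.summary())                                  # posterior summaries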
NASA Astrophysics Data System (ADS)
Michel, N.; Stoitsov, M. V.
2008-04-01
The fast computation of the Gauss hypergeometric function 2F1 with all its parameters complex is a difficult task. Although the 2F1 function verifies numerous analytical properties involving power series expansions whose implementation is apparently immediate, their use is thwarted by instabilities induced by cancellations between very large terms. Furthermore, small areas of the complex plane, in the vicinity of z = exp(±iπ/3), are inaccessible using 2F1 power series linear transformations. In order to solve these problems, a generalization of R.C. Forrey's transformation theory has been developed. The latter has been successful in treating the 2F1 function with real parameters. As in real case transformation theory, the large canceling terms occurring in 2F1 analytical formulas are rigorously dealt with, but by way of a new method, directly applicable to the complex plane. Taylor series expansions are employed to enter complex areas outside the domain of validity of power series analytical formulas. The proposed algorithm, however, becomes unstable in general when |a|, |b|, |c| are moderate or large. As a physical application, the calculation of the wave functions of the analytical Pöschl-Teller-Ginocchio potential involving 2F1 evaluations is considered. Program summary. Program title: hyp_2F1, PTG_wf Catalogue identifier: AEAE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 6839 No. of bytes in distributed program, including test data, etc.: 63 334 Distribution format: tar.gz Programming language: C++, Fortran 90 Computer: Intel i686 Operating system: Linux, Windows Word size: 64 bits Classification: 4.7 Nature of problem: The Gauss hypergeometric function 2F1, with all its parameters complex, is uniquely calculated in the frame of transformation theory with power series summations, thus providing a very fast algorithm. The evaluation of the wave functions of the analytical Pöschl-Teller-Ginocchio potential is treated as a physical application. Solution method: The Gauss hypergeometric function 2F1 verifies linear transformation formulas allowing consideration of arguments of a small modulus which then can be handled by a power series. They, however, give rise to indeterminate or numerically unstable cases when b-a and c-a-b are equal or close to integers. They are properly dealt with through analytical manipulations of the Lanczos expression providing the Gamma function. The remaining zones of the complex plane uncovered by transformation formulas are dealt with through Taylor expansions of the 2F1 function around complex points where linear transformations can be employed. The Pöschl-Teller-Ginocchio potential wave functions are calculated directly with 2F1 evaluations. Restrictions: The algorithm provides full numerical precision in almost all cases for |a|, |b|, and |c| of the order of one or smaller, but starts to be less precise or unstable when they increase, especially through the a, b, and c imaginary parts. While it is possible to run the code for moderate or large |a|, |b|, and |c| and obtain satisfactory results for some specified values, the code is very likely to be unstable in this regime.
Unusual features: Two different codes, one for the hypergeometric function and one for the Pöschl-Teller-Ginocchio potential wave functions, are provided in C++ and Fortran 90 versions. Running time: 20,000 2F1 function evaluations take an average of one second.
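For orientation only, the Gauss series that such codes build on (valid well inside |z| < 1 and without the transformation and stabilization machinery described above) can be sketched as follows; the tolerance and the closed-form cross-check are illustrative choices, not part of the published program.

    import cmath

    def hyp2f1_series(a, b, c, z, tol=1e-15, max_terms=2000):
        """Direct Gauss series 2F1(a,b;c;z) = sum_n (a)_n (b)_n / (c)_n * z^n / n!."""
        term = 1.0 + 0.0j
        total = term
        for n in range(max_terms):
            term *= (a + n) * (b + n) / ((c + n) * (n + 1)) * z
            total += term
            if abs(term) < tol * abs(total):
                return total
        raise RuntimeError("series did not converge; |z| too close to 1")

    # Cross-check against the closed form 2F1(1, 1; 2; z) = -log(1 - z) / z.
    z = 0.3 + 0.2j
    print(hyp2f1_series(1, 1, 2, z), -cmath.log(1 - z) / z)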
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
Report discusses safety program implementation for large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate hazards.
Reduze - Feynman integral reduction in C++
NASA Astrophysics Data System (ADS)
Studerus, C.
2010-07-01
Reduze is a computer program for reducing Feynman integrals to master integrals employing a Laporta algorithm. The program is written in C++ and uses classes provided by the GiNaC library to perform the simplifications of the algebraic prefactors in the system of equations. Reduze offers the possibility to run reductions in parallel. Program summary. Program title: Reduze Catalogue identifier: AEGE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 55 433 No. of bytes in distributed program, including test data, etc.: 554 866 Distribution format: tar.gz Programming language: C++ Computer: All Operating system: Unix/Linux Number of processors used: The number of processors is problem-dependent. More than one is possible, but not arbitrarily many. RAM: Depends on the complexity of the system. Classification: 4.4, 5 External routines: CLN (http://www.ginac.de/CLN/), GiNaC (http://www.ginac.de/) Nature of problem: Solving large systems of linear equations with Feynman integrals as unknowns and rational polynomials as prefactors. Solution method: Using a Gauss/Laporta algorithm to solve the system of equations. Restrictions: Limitations depend on the complexity of the system (number of equations, number of kinematic invariants). Running time: Depends on the complexity of the system.
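The solution method quoted above boils down to exact Gaussian elimination over a field of rational objects. A toy Python sketch with plain rational numbers (real Laporta reductions work with rational polynomials in the kinematic invariants, which this does not attempt) looks like:

    from fractions import Fraction

    def solve_exact(A, b):
        """Gaussian elimination with exact rational arithmetic (no rounding)."""
        n = len(A)
        M = [[Fraction(x) for x in row] + [Fraction(bi)] for row, bi in zip(A, b)]
        for col in range(n):
            pivot = next(r for r in range(col, n) if M[r][col] != 0)  # nonzero pivot
            M[col], M[pivot] = M[pivot], M[col]
            for r in range(col + 1, n):
                factor = M[r][col] / M[col][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
        x = [Fraction(0)] * n
        for r in reversed(range(n)):
            s = sum(M[r][c] * x[c] for c in range(r + 1, n))
            x[r] = (M[r][n] - s) / M[r][r]
        return x

    print(solve_exact([[2, 1], [1, 3]], [5, 10]))  # [Fraction(1, 1), Fraction(3, 1)]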
Platform options for the Space Station program
NASA Technical Reports Server (NTRS)
Mangano, M. J.; Rowley, R. W.
1986-01-01
Platforms for polar and 28.5 deg orbits were studied to determine the platform requirements and characteristics necessary to support the science objectives. Large platforms supporting the Earth-Observing System (EOS) were initially studied. Co-orbiting platforms were derived from these designs. Because cost estimates indicated that the large platform approach was likely to be too expensive, require several launches, and generally be excessively complex, studies of small platforms were undertaken. Results of these studies show the small platform approach to be technically feasible at lower overall cost. All designs maximized hardware inheritance from the Space Station program to reduce costs. Science objectives as defined at the time of these studies are largely achievable.
Large space structure damping design
NASA Technical Reports Server (NTRS)
Pilkey, W. D.; Haviland, J. K.
1983-01-01
Several FORTRAN subroutines and programs were developed which compute complex eigenvalues of a damped system using different approaches, and which rescale mode shapes to unit generalized mass and make rigid bodies orthogonal to each other. An analytical proof of a Minimum Constrained Frequency Criterion (MCFC) for a single damper is presented. A method to minimize the effect of control spillover for large space structures is proposed. The characteristic equation of an undamped system with a generalized control law is derived using reanalysis theory. This equation can be implemented in computer programs for efficient eigenvalue analysis or control quasi-synthesis. Methods to control vibrations in large space structures are reviewed and analyzed. The resulting prototype, using an electromagnetic actuator, is described.
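A small SciPy sketch of the two computations named above, complex eigenvalues of a damped system and rescaling of mode shapes to unit generalized mass, using assumed 2x2 mass, damping and stiffness matrices rather than anything from the report:

    import numpy as np
    from scipy.linalg import eig, eigh

    # Assumed system matrices for M x'' + C x' + K x = 0.
    M = np.diag([2.0, 1.0])
    K = np.array([[6.0, -2.0], [-2.0, 4.0]])
    C = 0.05 * M + 0.01 * K                      # illustrative proportional damping

    # Complex eigenvalues via the first-order state-space form z' = A z, z = [x, x'].
    n = M.shape[0]
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    lam, _ = eig(A)
    print(np.sort_complex(lam))                  # eigenvalues come in conjugate pairs

    # Undamped mode shapes rescaled so that phi.T @ M @ phi = I (unit generalized mass).
    _, phi = eigh(K, M)
    generalized_mass = np.einsum('ij,jk,ki->i', phi.T, M, phi)
    phi = phi / np.sqrt(generalized_mass)
    print(np.round(phi.T @ M @ phi, 10))         # identity up to round-off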
NASTRAN application for the prediction of aircraft interior noise
NASA Technical Reports Server (NTRS)
Marulo, Francesco; Beyer, Todd B.
1987-01-01
The application of a structural-acoustic analogy within the NASTRAN finite element program for the prediction of aircraft interior noise is presented. Some refinements of the method, which reduce the amount of computation required for large, complex structures, are discussed. Also, further improvements are proposed and preliminary comparisons with structural and acoustic modal data obtained for a large, composite cylinder are presented.
New Madrid Seismotectonic Program. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschbach, T.C.
1986-06-01
The New Madrid Seismotectonic Program was a large-scale multidisciplinary effort that was designed to define the structural setting and tectonic history of the New Madrid area in order to realistically evaluate earthquake risks in the siting of nuclear facilities. The tectonic model proposed to explain the New Madrid seismicity is the "zone of weakness" model, which suggests that an ancient rift complex formed a zone of weakness in the earth's crust along which regional stresses are relieved. The Reelfoot Rift portion of the proposed rift complex is currently seismically active, and it must be considered capable and likely to be exposed to large-magnitude earthquakes in the future. Earthquakes that occur in the Wabash Valley area are less abundant and generally have deeper hypocenters than earthquakes in the New Madrid area. The area of the Southern Indiana Arm must be considered to have seismic risk, although to a lesser extent than the Reelfoot Rift. The east-west trending Rough Creek Graben is practically aseismic, probably in large part due to its orientation in the current stress field. The northwest-trending St. Louis Arm of the proposed rift complex includes a pattern of seismicity that extends from southern Illinois along the Mississippi River. This arm must be considered to have seismic risk, but because of the lack of development of a graben associated with the arm and the orientation of the arm in the current stress field, the risk appears to be less than in the Reelfoot Rift portion of the rift complex.
Optimization Research of Generation Investment Based on Linear Programming Model
NASA Astrophysics Data System (ADS)
Wu, Juan; Ge, Xueqian
Linear programming is an important branch of operational research and a mathematical method that assists people in carrying out scientific management. GAMS is an advanced simulation and optimization modeling language that combines a large number of complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, based on a linear programming model, optimized generation investment decision-making is simulated and analyzed. Finally, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
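For a concrete flavor of the kind of model involved (the numbers and the two-technology setup below are invented for illustration and are solved with SciPy rather than GAMS), a tiny generation-investment LP can be written as:

    from scipy.optimize import linprog

    # Decision variables: x = [coal_MW, wind_MW] of installed capacity.
    cost = [60.0, 45.0]               # assumed annualized cost per MW of each technology

    # Firm-capacity constraint: 0.9*coal + 0.3*wind >= 1000 MW of peak demand,
    # written in the <= form that linprog expects.
    A_ub = [[-0.9, -0.3]]
    b_ub = [-1000.0]
    bounds = [(0, 1500), (0, 2000)]   # assumed build limits per technology

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print(res.x, res.fun)             # optimal capacities and minimum total cost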
BigDataScript: a scripting language for data pipelines.
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
BigDataScript: a scripting language for data pipelines
Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu
2015-01-01
Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778
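The "lazy processing" idea described above can be pictured with a small Python sketch: a step is re-run only when its outputs are missing or older than its inputs, so a restarted pipeline resumes where it stopped. This is only an illustration of the concept; BigDataScript expresses it with its own task and dependency declarations and additionally serializes pipeline state. The commands and file names are hypothetical.

    import os
    import subprocess

    def lazy_step(cmd, inputs, outputs):
        """Run a shell command unless every output is newer than every input."""
        up_to_date = all(os.path.exists(o) for o in outputs) and (
            not inputs
            or min(os.path.getmtime(o) for o in outputs)
            >= max(os.path.getmtime(i) for i in inputs))
        if up_to_date:
            print(f"skip: {cmd}")
            return
        subprocess.run(cmd, shell=True, check=True)

    # A two-step pipeline; re-running the script skips steps already completed.
    lazy_step("sort raw.txt > sorted.txt", ["raw.txt"], ["sorted.txt"])
    lazy_step("uniq -c sorted.txt > counts.txt", ["sorted.txt"], ["counts.txt"])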
Embedding Agile Practices within a Plan-Driven Hierarchical Project Life Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millard, W. David; Johnson, Daniel M.; Henderson, John M.
2014-07-28
Organizations use structured, plan-driven approaches to provide continuity, direction, and control to large, multi-year programs. Projects within these programs vary greatly in size, complexity, level of maturity, technical risk, and clarity of the development objectives. Organizations that perform exploratory research, evolutionary development, and other R&D activities can obtain the benefits of Agile practices without losing the benefits of their program’s overarching plan-driven structure. This paper describes application of Agile development methods on a large plan-driven sensor integration program. While the client employed plan-driven, requirements flow-down methodologies, tight project schedules and complex interfaces called for frequent end-to-end demonstrations to provide feedback during system development. The development process maintained the many benefits of plan-driven project execution with the rapid prototyping, integration, demonstration, and client feedback possible through Agile development methods. This paper also describes some of the tools and implementing mechanisms used to transition between and take advantage of each methodology, and presents lessons learned from the project management, system engineering, and developer’s perspectives.
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado
2015-10-01
Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.
Complexity as a Factor of Quality and Cost in Large Scale Software Development.
1979-12-01
allocating testing resources." ... The role of complexity in resource estimation and allocation: it can be argued that blame for the... allocation of testing resources by identifying independent substructures and identifying heavily used logic paths; setting a design threshold...
Direct heuristic dynamic programming for damping oscillations in a large power system.
Lu, Chao; Si, Jennie; Xie, Xiaorong
2008-08-01
This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation with the presence of nonlinearity, uncertainty, and coupling effect among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static var compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case aims at addressing a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.
Das, Ravi; Bhattacharjee, Shatabdi; Patel, Atit A; Harris, Jenna M; Bhattacharya, Surajit; Letcher, Jamin M; Clark, Sarah G; Nanda, Sumit; Iyer, Eswar Prasad R; Ascoli, Giorgio A; Cox, Daniel N
2017-12-01
Transcription factors (TFs) have emerged as essential cell autonomous mediators of subtype specific dendritogenesis; however, the downstream effectors of these TFs remain largely unknown, as are the cellular events that TFs control to direct morphological change. As dendritic morphology is largely dictated by the organization of the actin and microtubule (MT) cytoskeletons, elucidating TF-mediated cytoskeletal regulatory programs is key to understanding molecular control of diverse dendritic morphologies. Previous studies in Drosophila melanogaster have demonstrated that the conserved TFs Cut and Knot exert combinatorial control over aspects of dendritic cytoskeleton development, promoting actin and MT-based arbor morphology, respectively. To investigate transcriptional targets of Cut and/or Knot regulation, we conducted systematic neurogenomic studies, coupled with in vivo genetic screens utilizing multi-fluor cytoskeletal and membrane marker reporters. These analyses identified a host of putative Cut and/or Knot effector molecules, and a subset of these putative TF targets converge on modulating dendritic cytoskeletal architecture, which are grouped into three major phenotypic categories, based upon neuromorphometric analyses: complexity enhancer, complexity shifter, and complexity suppressor. Complexity enhancer genes normally function to promote higher order dendritic growth and branching with variable effects on MT stabilization and F-actin organization, whereas complexity shifter and complexity suppressor genes normally function in regulating proximal-distal branching distribution or in restricting higher order branching complexity, respectively, with spatially restricted impacts on the dendritic cytoskeleton. Collectively, we implicate novel genes and cellular programs by which TFs distinctly and combinatorially govern dendritogenesis via cytoskeletal modulation. Copyright © 2017 by the Genetics Society of America.
Configuration Management at NASA
NASA Technical Reports Server (NTRS)
Doreswamy, Rajiv
2013-01-01
NASA programs are characterized by complexity, harsh environments and the fact that we usually have one chance to get it right. Programs last decades and need to accept new hardware and technology as it is developed. We have multiple suppliers and international partners. Our challenges are many, our costs are high and our failures are highly visible. CM systems need to be scalable, adaptable to new technology and span the life cycle of the program (30+ years). Multiple systems, contractors and countries added major levels of complexity to the ISS program and its CM/DM and requirements management systems. • CM systems need to be designed for a long design life • Space Station design started in 1984 • Assembly complete in 2012 • Systems were developed on a task basis without an overall system perspective • Technology moves faster than a large project office; try to make sure you have a system that can adapt
Development and Evaluation of a Pharmacogenomics Educational Program for Pharmacists
Formea, Christine M.; Nicholson, Wayne T.; McCullough, Kristen B.; Berg, Kevin D.; Berg, Melody L.; Cunningham, Julie L.; Merten, Julianna A.; Ou, Narith N.; Stollings, Joanna L.
2013-01-01
Objectives. To evaluate hospital and outpatient pharmacists’ pharmacogenomics knowledge before and 2 months after participating in a targeted, case-based pharmacogenomics continuing education program. Design. As part of a continuing education program accredited by the Accreditation Council for Pharmacy Education (ACPE), pharmacists were provided with a fundamental pharmacogenomics education program. Evaluation. An 11-question, multiple-choice, electronic survey instrument was distributed to 272 eligible pharmacists at a single campus of a large, academic healthcare system. Pharmacists improved their pharmacogenomics test scores by 0.7 questions (pretest average 46%; posttest average 53%, p=0.0003). Conclusions. Although pharmacists demonstrated improvement, overall retention of educational goals and objectives was marginal. These results suggest that the complex topic of pharmacogenomics requires a large educational effort in order to increase pharmacists’ knowledge and comfort level with this emerging therapeutic opportunity. PMID:23459098
Development and evaluation of a pharmacogenomics educational program for pharmacists.
Formea, Christine M; Nicholson, Wayne T; McCullough, Kristen B; Berg, Kevin D; Berg, Melody L; Cunningham, Julie L; Merten, Julianna A; Ou, Narith N; Stollings, Joanna L
2013-02-12
Objectives. To evaluate hospital and outpatient pharmacists' pharmacogenomics knowledge before and 2 months after participating in a targeted, case-based pharmacogenomics continuing education program. Design. As part of a continuing education program accredited by the Accreditation Council for Pharmacy Education (ACPE), pharmacists were provided with a fundamental pharmacogenomics education program. Evaluation. An 11-question, multiple-choice, electronic survey instrument was distributed to 272 eligible pharmacists at a single campus of a large, academic healthcare system. Pharmacists improved their pharmacogenomics test scores by 0.7 questions (pretest average 46%; posttest average 53%, p=0.0003). Conclusions. Although pharmacists demonstrated improvement, overall retention of educational goals and objectives was marginal. These results suggest that the complex topic of pharmacogenomics requires a large educational effort in order to increase pharmacists' knowledge and comfort level with this emerging therapeutic opportunity.
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
Scanlon, Dennis P; Wolf, Laura J; Alexander, Jeffrey A; Christianson, Jon B; Greene, Jessica; Jean-Jacques, Muriel; McHugh, Megan; Shi, Yunfeng; Leitzell, Brigitt; Vanderbrink, Jocelyn M
2016-08-01
The Aligning Forces for Quality (AF4Q) initiative was the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site complex program, RWJF funded an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced during the summative evaluation phase of this near decade-long program are discussed. A descriptive overview of the summative research design and its development for a multi-site, community-based, healthcare quality improvement initiative is provided. The summative research design employed by the evaluation team is discussed. The evaluation team's summative research design involved a data-driven assessment of the effectiveness of the AF4Q program at large, assessments of the impact of AF4Q in the specific programmatic areas, and an assessment of how the AF4Q alliances were positioned for the future at the end of the program. The AF4Q initiative was the largest privately funded community-based healthcare improvement initiative in the United States to date and was implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The summative evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similarly complex community-based initiatives.
Mehdipanah, Roshanak; Manzano, Ana; Borrell, Carme; Malmusi, Davide; Rodriguez-Sanz, Maica; Greenhalgh, Joanne; Muntaner, Carles; Pawson, Ray
2015-01-01
Urban populations are growing and to accommodate these numbers, cities are becoming more involved in urban renewal programs to improve the physical, social and economic conditions in different areas. This paper explores some of the complexities surrounding the link between urban renewal, health and health inequalities using a theory-driven approach. We focus on an urban renewal initiative implemented in Barcelona, the Neighbourhoods Law, targeting Barcelona's (Spain) most deprived neighbourhoods. We present evidence from two studies on the health evaluation of the Neighbourhoods Law, while drawing from recent urban renewal literature, to follow a four-step process to develop a program theory. We then use two specific urban renewal interventions, the construction of a large central plaza and the repair of streets and sidewalks, to further examine this link. In order for urban renewal programs to affect health and health inequality, neighbours must use and adapt to the changes produced by the intervention. However, there exist barriers that can result in negative outcomes including factors such as accessibility, safety and security. This paper provides a different perspective to the field that is largely dominated by traditional quantitative studies that are not always able to address the complexities such interventions provide. Furthermore, the framework and discussions serve as a guide for future research, policy development and evaluation. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Walsh, S. M. Steve
A study was conducted to determine why such a small number (less than 2 percent) of the approximately 9,000 adult male prisoners housed in the 3 complexes of the California Institution for Men (Chino, California) in the mid-1980s were actively participating in the college program offered at the prison sites. Data were collected through interviews…
1977-01-26
Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I...computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000...performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU
Yu, Xiaoliang; Li, Yun; Chen, Qin; Su, Chenhe; Zhang, Zili; Yang, Chengkui; Hu, Zhilin; Hou, Jue; Zhou, Jinying; Gong, Ling; Jiang, Xuejun
2015-01-01
ABSTRACT Receptor-interacting protein kinase 3 (RIP3) and its substrate mixed-lineage kinase domain-like protein (MLKL) are core regulators of programmed necrosis. The elimination of pathogen-infected cells by programmed necrosis acts as an important host defense mechanism. Here, we report that human herpes simplex virus 1 (HSV-1) and HSV-2 had opposite impacts on programmed necrosis in human cells versus their impacts in mouse cells. Similar to HSV-1, HSV-2 infection triggered programmed necrosis in mouse cells. However, neither HSV-1 nor HSV-2 infection was able to induce programmed necrosis in human cells. Moreover, HSV-1 or HSV-2 infection in human cells blocked tumor necrosis factor (TNF)-induced necrosis by preventing the induction of an RIP1/RIP3 necrosome. The HSV ribonucleotide reductase large subunit R1 was sufficient to suppress TNF-induced necrosis, and its RIP homotypic interaction motif (RHIM) domain was required to disrupt the RIP1/RIP3 complex in human cells. Therefore, this study provides evidence that HSV has likely evolved strategies to evade the host defense mechanism of programmed necrosis in human cells. IMPORTANCE This study demonstrated that infection with HSV-1 and HSV-2 blocked TNF-induced necrosis in human cells while these viruses directly activated programmed necrosis in mouse cells. Expression of HSV R1 suppressed TNF-induced necrosis of human cells. The RHIM domain of R1 was essential for its association with human RIP3 and RIP1, leading to disruption of the RIP1/RIP3 complex. This study provides new insights into the species-specific modulation of programmed necrosis by HSV. PMID:26559832
Yu, Xiaoliang; Li, Yun; Chen, Qin; Su, Chenhe; Zhang, Zili; Yang, Chengkui; Hu, Zhilin; Hou, Jue; Zhou, Jinying; Gong, Ling; Jiang, Xuejun; Zheng, Chunfu; He, Sudan
2016-01-15
Receptor-interacting protein kinase 3 (RIP3) and its substrate mixed-lineage kinase domain-like protein (MLKL) are core regulators of programmed necrosis. The elimination of pathogen-infected cells by programmed necrosis acts as an important host defense mechanism. Here, we report that human herpes simplex virus 1 (HSV-1) and HSV-2 had opposite impacts on programmed necrosis in human cells versus their impacts in mouse cells. Similar to HSV-1, HSV-2 infection triggered programmed necrosis in mouse cells. However, neither HSV-1 nor HSV-2 infection was able to induce programmed necrosis in human cells. Moreover, HSV-1 or HSV-2 infection in human cells blocked tumor necrosis factor (TNF)-induced necrosis by preventing the induction of an RIP1/RIP3 necrosome. The HSV ribonucleotide reductase large subunit R1 was sufficient to suppress TNF-induced necrosis, and its RIP homotypic interaction motif (RHIM) domain was required to disrupt the RIP1/RIP3 complex in human cells. Therefore, this study provides evidence that HSV has likely evolved strategies to evade the host defense mechanism of programmed necrosis in human cells. This study demonstrated that infection with HSV-1 and HSV-2 blocked TNF-induced necrosis in human cells while these viruses directly activated programmed necrosis in mouse cells. Expression of HSV R1 suppressed TNF-induced necrosis of human cells. The RHIM domain of R1 was essential for its association with human RIP3 and RIP1, leading to disruption of the RIP1/RIP3 complex. This study provides new insights into the species-specific modulation of programmed necrosis by HSV. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.
2017-10-01
Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, including human, organisational and environmental factors, affect an IS in an organisation; investigating IS success is therefore a complex problem. In addition, the competitive business environment and the high volume of information flow in organisations have raised new issues such as resilient ISs and successful customer relationship management (CRM). A resilient IS provides sustainable delivery of information to internal and external customers. This paper presents an integrated approach, based on CRM and resilience engineering (RE), to enhance and optimise the performance of each component of a large IS in a gas company. This performance enhancement can help ISs perform business tasks efficiently. The data are collected from standard questionnaires and then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS that combines RE and CRM.
C++, objected-oriented programming, and astronomical data models
NASA Technical Reports Server (NTRS)
Farris, A.
1992-01-01
Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of objected-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
Computational Aspects of Heat Transfer in Structures
NASA Technical Reports Server (NTRS)
Adelman, H. M. (Compiler)
1982-01-01
Techniques for the computation of heat transfer and associated phenomena in complex structures are examined with an emphasis on reentry flight vehicle structures. Analysis methods, computer programs, thermal analysis of large space structures and high speed vehicles, and the impact of computer systems are addressed.
Development of a Catalytic Combustor for Aircraft Gas Turbine Engines.
1976-09-22
VI. DESIGN OF 7.6 CM DIAMETER COMBUSTORS. 1. Design and Fabrication of Combustors for Large Scale Test in...obtained for this program included round holes of different diameters, squares, rectangles, triangles, and other more complex hollow configurations
USDA-ARS?s Scientific Manuscript database
Classical quantitative genetics aids crop improvement by providing the means to estimate heritability, genetic correlations, and predicted responses to various selection schemes. Genomics has the potential to aid quantitative genetics and applied crop improvement programs via large-scale, high-thro...
NASA Astrophysics Data System (ADS)
Niemann, Brand Lee
A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear induced dispersion" to become effective horizontal dispersion by vertical mixing over the shear layer. The statistics of relative particle dispersion in the three component directions have been summarized and stratified by flow parameters for use in practical prediction problems.
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in easy to manipulate text file. © 2009 Blackwell Publishing Ltd.
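DnaSAM itself is not reproduced here, but the first half of the workflow it automates, computing per-locus diversity and neutrality statistics from an alignment, can be sketched in a few lines. The toy alignment and the pure-Python Tajima's D below are illustrative only; in the paper, null distributions for such statistics come from Hudson's ms coalescent simulator, which is not shown.

```python
# A minimal, self-contained sketch (not DnaSAM itself) of the per-locus summary
# statistics such a pipeline computes: segregating sites, nucleotide diversity
# and Tajima's D from an aligned sample of sequences.
from itertools import combinations
from math import sqrt

def tajimas_d(alignment):
    """alignment: list of equal-length DNA strings (one per sampled chromosome)."""
    n, L = len(alignment), len(alignment[0])
    # Segregating sites S and average pairwise differences (pi)
    S = sum(1 for j in range(L) if len({seq[j] for seq in alignment}) > 1)
    pairs = list(combinations(alignment, 2))
    pi = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs) / len(pairs)
    if S == 0:
        return 0.0
    # Standard constants from Tajima (1989)
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)
    return (pi - S / a1) / sqrt(e1 * S + e2 * S * (S - 1))

print(tajimas_d(["ATGCATGC", "ATGCATGT", "ATGAATGC", "TTGCATGC"]))
```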
The application of dynamic programming in production planning
NASA Astrophysics Data System (ADS)
Wu, Run
2017-05-01
Nowadays, with the popularity of computers, computer information technology is widely applied across industries and fields, creating huge demand for a wide variety of application software. To develop software that meets diverse needs at the lowest cost and with the best quality, programmers must design efficient algorithms: a superior algorithm not only solves the problem at hand but also maximizes the benefits while incurring the smallest overhead. Dynamic programming is one such common technique, used to solve problems that exhibit optimal substructure. When a problem contains a large number of overlapping subproblems that would otherwise be recomputed, plain recursion can require exponential time, whereas a dynamic programming algorithm can reduce the time complexity to the polynomial level; dynamic programming is therefore highly efficient compared with other approaches, reducing the computational complexity while enriching the computational results. In this paper, we expound the concept, basic elements, properties, core ideas, solution steps and difficulties of the dynamic programming algorithm, and establish a dynamic programming model for the production planning problem.
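As a rough illustration of the kind of model the paper describes, and not its actual formulation, the sketch below solves a small single-product production planning problem by dynamic programming over (period, ending inventory); the demands, costs and capacity are invented.

```python
# Plan production over T periods to meet known demand at minimum
# production-plus-holding cost, via DP over (period, ending inventory).

def plan_production(demand, unit_cost, setup_cost, hold_cost, capacity):
    T = len(demand)
    max_inv = sum(demand)  # inventory never needs to exceed remaining demand
    INF = float("inf")
    # best[i] = minimal cost of starting the current period with inventory i
    best = [INF] * (max_inv + 1)
    best[0] = 0.0
    choice = []  # choice[t][i] = production quantity chosen in period t ending with inventory i

    for t in range(T):
        new_best = [INF] * (max_inv + 1)
        new_choice = [0] * (max_inv + 1)
        for inv, cost in enumerate(best):
            if cost == INF:
                continue
            for q in range(capacity + 1):            # produce q units this period
                end = inv + q - demand[t]            # inventory after meeting demand
                if end < 0 or end > max_inv:
                    continue
                c = cost + unit_cost * q + (setup_cost if q > 0 else 0) + hold_cost * end
                if c < new_best[end]:
                    new_best[end] = c
                    new_choice[end] = q
        best = new_best
        choice.append(new_choice)

    # Recover the plan by walking the stored choices backwards from zero ending inventory.
    plan, inv = [], 0
    for t in reversed(range(T)):
        q = choice[t][inv]
        plan.append(q)
        inv = inv + demand[t] - q
    return best[0], list(reversed(plan))

if __name__ == "__main__":
    cost, plan = plan_production(demand=[3, 2, 4, 1], unit_cost=2.0,
                                 setup_cost=5.0, hold_cost=1.0, capacity=5)
    print(cost, plan)
```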
Cultural Factors in Managing an FMS Case Program: Saudi Arabian Army Ordnance Corps (SOCP) Program
1977-11-01
which included the purchase of large amounts of US-produced current generation self-propelled artillery, personnel carriers, tanks, mortar carriers...expressed when attempting to discuss complex, sophisticated technical material with senior counterparts who possessed relative fluency in...ignored with impunity; they cannot be avoided; they can to a great extent be anticipated as critical management factors. By anticipating and preparing
Habitat Complexity Metrics to Guide Restoration of Large Rivers
NASA Astrophysics Data System (ADS)
Jacobson, R. B.; McElroy, B. J.; Elliott, C.; DeLonay, A.
2011-12-01
Restoration strategies on large, channelized rivers typically strive to recover lost habitat complexity, based on the assumption that complexity and biophysical capacity are directly related. Although definition of links between complexity and biotic responses can be tenuous, complexity metrics have appeal because of their potential utility in quantifying habitat quality, defining reference conditions and design criteria, and measuring restoration progress. Hydroacoustic instruments provide many ways to measure complexity on large rivers, yet substantive questions remain about variables and scale of complexity that are meaningful to biota, and how complexity can be measured and monitored cost effectively. We explore these issues on the Missouri River, using the example of channel re-engineering projects that are intended to aid in recovery of the pallid sturgeon, an endangered benthic fish. We are refining understanding of what habitat complexity means for adult fish by combining hydroacoustic habitat assessments with acoustic telemetry to map locations during reproductive migrations and spawning. These data indicate that migrating sturgeon select points with relatively low velocity but adjacent to areas of high velocity (that is, with high velocity gradients); the integration of points defines pathways which minimize energy expenditures during upstream migrations of 10's to 100's of km. Complexity metrics that efficiently quantify migration potential at the reach scale are therefore directly relevant to channel restoration strategies. We are also exploring complexity as it relates to larval sturgeon dispersal. Larvae may drift for as many as 17 days (100's of km at mean velocities) before using up their yolk sac, after which they "settle" into habitats where they initiate feeding. An assumption underlying channel re-engineering is that additional channel complexity, specifically increased shallow, slow water, is necessary for early feeding and refugia. Development of complexity metrics is complicated by the fact that characteristics of channel morphology may increase complexity scores without necessarily increasing biophysical capacity for target species. For example, a cross section that samples depths and velocities across the thalweg (navigation channel) and into lentic habitat may score high on most measures of hydraulic or geomorphic complexity, but does not necessarily provide habitats beneficial to native species. Complexity measures need to be bounded by best estimates of native species requirements. In the absence of specific information, creation of habitat complexity for the sake of complexity may lead to unintended consequences, for example, lentic habitats that increase a complexity score but support invasive species. An additional practical constraint on complexity measures is the need to develop metrics that can be deployed cost-effectively in an operational monitoring program. Design of a monitoring program requires informed choices of measurement variables, definition of reference sites, and design of sampling effort to capture spatial and temporal variability.
Enhancer Activation Requires Trans-Recruitment of a Mega Transcription Factor Complex
Liu, Zhijie; Merkurjev, Daria; Yang, Feng; Li, Wenbo; Oh, Soohwan; Friedman, Meyer J.; Song, Xiaoyuan; Zhang, Feng; Ma, Qi; Ohgi, Kenneth; Krones, Anna; Rosenfeld, Michael G.
2014-01-01
Summary Enhancers provide critical information directing cell-type specific transcriptional programs, regulated by binding of signal-dependent transcription factors and their associated cofactors. Here we report that the most strongly activated estrogen (E2)-responsive enhancers are characterized by trans-recruitment and in situ assembly of a large 1-2 MDa complex of diverse DNA-binding transcription factors by ERα at ERE-containing enhancers. We refer to enhancers recruiting these factors as mega transcription factor-bound in trans (MegaTrans) enhancers. The MegaTrans complex is a signature of the most potent functional enhancers and is required for activation of enhancer RNA transcription and recruitment of coactivators, including p300 and Med1. The MegaTrans complex functions, in part, by recruiting specific enzymatic machinery, exemplified by DNA-dependent protein kinase. Thus, MegaTrans-containing enhancers represent a cohort of functional enhancers that mediate a broad and important transcriptional program and provide a molecular explanation for transcription factor clustering and hotspots noted in the genome. PMID:25303530
Development of visual 3D virtual environment for control software
NASA Technical Reports Server (NTRS)
Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence
1991-01-01
Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be for control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful in programming individual processors. However, they are obviously insufficient to program a large and complicated system, that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution for this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in virtual 3D world. One of the major features of the environment is the 3D representation of concurrent process. 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful to check relationship among large number of processes or processors) and the time chart (which is useful to check precise timing for synchronization) into a single 3D space. The 3D representation gives us a capability for direct and intuitive planning or understanding of complicated relationship among many concurrent processes. To realize the 3D representation, a technology to enable easy handling of virtual 3D object is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), our prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D environment has considerable potential in the field of software engineering.
Large Crawler Crane for new lightning protection system
2007-10-25
A large crawler crane arrives at the turn basin at the Launch Complex 39 Area on NASA's Kennedy Space Center. The crane with its 70-foot boom will be moved to Launch Pad 39B and used to construct a new lightning protection system for the Constellation Program and Ares/Orion launches. Pad B will be the site of the first Ares vehicle launch, including Ares I-X which is scheduled for April 2009.
Synthetic mixed-signal computation in living cells
Rubens, Jacob R.; Selvaggio, Gianluca; Lu, Timothy K.
2016-01-01
Living cells implement complex computations on the continuous environmental signals that they encounter. These computations involve both analogue- and digital-like processing of signals to give rise to complex developmental programs, context-dependent behaviours and homeostatic activities. In contrast to natural biological systems, synthetic biological systems have largely focused on either digital or analogue computation separately. Here we integrate analogue and digital computation to implement complex hybrid synthetic genetic programs in living cells. We present a framework for building comparator gene circuits to digitize analogue inputs based on different thresholds. We then demonstrate that comparators can be predictably composed together to build band-pass filters, ternary logic systems and multi-level analogue-to-digital converters. In addition, we interface these analogue-to-digital circuits with other digital gene circuits to enable concentration-dependent logic. We expect that this hybrid computational paradigm will enable new industrial, diagnostic and therapeutic applications with engineered cells. PMID:27255669
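As a loose illustration of the comparator idea, and not the authors' genetic circuit models, the sketch below idealises a comparator as a Hill-type switch on an analogue input and composes comparators with different thresholds into a multi-level analogue-to-digital readout and a band-pass response; the thresholds and Hill coefficient are made up.

```python
# Comparators idealised as Hill-function switches on an analogue input
# (e.g. an inducer concentration); composition gives an ADC and a band-pass.

def comparator(signal, threshold, n=4):
    """Hill-function 'comparator': ~0 below threshold, ~1 above it."""
    return signal**n / (signal**n + threshold**n)

def adc(signal, thresholds):
    """Multi-level ADC: count how many comparators have switched on."""
    return sum(comparator(signal, t) > 0.5 for t in sorted(thresholds))

def band_pass(signal, low, high):
    """Band-pass: ON only between the low and high thresholds."""
    return comparator(signal, low) > 0.5 and not comparator(signal, high) > 0.5

for s in (0.1, 0.5, 2.0, 8.0):
    print(s, adc(s, thresholds=[0.3, 1.0, 5.0]), band_pass(s, low=0.3, high=5.0))
```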
Mahoney, J. Matthew; Titiz, Ali S.; Hernan, Amanda E.; Scott, Rod C.
2016-01-01
Hippocampal neural systems consolidate multiple complex behaviors into memory. However, the temporal structure of neural firing supporting complex memory consolidation is unknown. Replay of hippocampal place cells during sleep supports the view that a simple repetitive behavior modifies sleep firing dynamics, but does not explain how multiple episodes could be integrated into associative networks for recollection during future cognition. Here we decode sequential firing structure within spike avalanches of all pyramidal cells recorded in sleeping rats after running in a circular track. We find that short sequences that combine into multiple long sequences capture the majority of the sequential structure during sleep, including replay of hippocampal place cells. The ensemble, however, is not optimized for maximally producing the behavior-enriched episode. Thus behavioral programming of sequential correlations occurs at the level of short-range interactions, not whole behavioral sequences and these short sequences are assembled into a large and complex milieu that could support complex memory consolidation. PMID:26866597
Cellular automata with object-oriented features for parallel molecular network modeling.
Zhu, Hao; Wu, Yinghui; Huang, Sui; Sun, Yan; Dhar, Pawan
2005-06-01
Cellular automata are an important modeling paradigm for studying the dynamics of large, parallel systems composed of multiple, interacting components. However, to model biological systems, cellular automata need to be extended beyond the large-scale parallelism and intensive communication in order to capture two fundamental properties characteristic of complex biological systems: hierarchy and heterogeneity. This paper proposes extensions to a cellular automata language, Cellang, to meet this purpose. The extended language, with object-oriented features, can be used to describe the structure and activity of parallel molecular networks within cells. Capabilities of this new programming language include object structure to define molecular programs within a cell, floating-point data type and mathematical functions to perform quantitative computation, message passing capability to describe molecular interactions, as well as new operators, statements, and built-in functions. We discuss relevant programming issues of these features, including the object-oriented description of molecular interactions with molecule encapsulation, message passing, and the description of heterogeneity and anisotropy at the cell and molecule levels. By enabling the integration of modeling at the molecular level with system behavior at cell, tissue, organ, or even organism levels, the program will help improve our understanding of how complex and dynamic biological activities are generated and controlled by parallel functioning of molecular networks. Index Terms-Cellular automata, modeling, molecular network, object-oriented.
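The extended Cellang language itself is not shown here; the toy Python sketch below only illustrates the two ideas the abstract highlights, namely cells that encapsulate their own molecular state as objects and that interact solely by passing messages to their neighbours on each synchronous update. The secretion and degradation rates are invented.

```python
# Object-oriented cellular automaton sketch: each lattice cell holds its own
# molecular state and exchanges messages with its four neighbours each step.
import random

class Cell:
    def __init__(self, protein=0.0):
        self.protein = protein      # encapsulated molecular state
        self.inbox = []             # messages received from neighbours

    def send(self, neighbours):
        # Secrete 10% of the protein, split evenly among neighbours.
        share = 0.1 * self.protein / max(len(neighbours), 1)
        for nb in neighbours:
            nb.inbox.append(share)

    def update(self):
        # Synchronous update: keep what was not secreted, degrade it slightly,
        # add what the neighbours sent, then clear the inbox.
        self.protein = 0.95 * (0.9 * self.protein) + sum(self.inbox)
        self.inbox.clear()

def step(grid):
    n = len(grid)
    for i in range(n):
        for j in range(n):
            neighbours = [grid[(i + di) % n][(j + dj) % n]
                          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            grid[i][j].send(neighbours)
    for row in grid:
        for cell in row:
            cell.update()

grid = [[Cell(random.random()) for _ in range(8)] for _ in range(8)]
for _ in range(50):
    step(grid)
print(sum(c.protein for row in grid for c in row))
```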
Simulator for multilevel optimization research
NASA Technical Reports Server (NTRS)
Padula, S. L.; Young, K. C.
1986-01-01
A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.
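A minimal sketch of the simulator's premise, assuming nothing about its actual test problems: cheap analytic functions stand in for discipline analyses, and a system-level optimizer coordinates subsystem optimizations of their local variables.

```python
# Two-level decomposition with analytic stand-in "disciplines": the system
# level varies a shared design variable, each subsystem optimizes its own
# local variable for the current shared value.
from scipy.optimize import minimize

def subsystem(local_obj, x_shared):
    """Subproblem: optimize local variable y for the fixed shared design x."""
    res = minimize(lambda y: local_obj(x_shared, y[0]), x0=[0.0])
    return res.fun

def system_objective(x):
    x_shared = x[0]
    f1 = subsystem(lambda xs, y: (y - xs) ** 2 + 0.5 * xs ** 2, x_shared)     # stand-in discipline 1
    f2 = subsystem(lambda xs, y: (y + 1.0) ** 2 + (xs - 2.0) ** 2, x_shared)  # stand-in discipline 2
    return f1 + f2

result = minimize(system_objective, x0=[0.0], method="Nelder-Mead")
print(result.x, result.fun)
```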
NASA's Space Launch System (SLS) Program: Mars Program Utilization
NASA Technical Reports Server (NTRS)
May, Todd A.; Creech, Stephen D.
2012-01-01
NASA's Space Launch System is being designed for safe, affordable, and sustainable human and scientific exploration missions beyond Earth's orbit (BEO), as directed by the NASA Authorization Act of 2010 and NASA's 2011 Strategic Plan. This paper describes how the SLS can dramatically change the Mars program's science and human exploration capabilities and objectives. Specifically, through its high-velocity change (delta V) and payload capabilities, SLS enables Mars science missions of unprecedented size and scope. By providing direct trajectories to Mars, SLS eliminates the need for complicated gravity-assist missions around other bodies in the solar system, reducing mission time, complexity, and cost. SLS's large payload capacity also allows for larger, more capable spacecraft or landers with more instruments, which can eliminate the need for complex packaging or "folding" mechanisms. By offering this capability, SLS can enable more science to be done more quickly than would be possible through other delivery mechanisms using longer mission times.
NASA's planetary protection program as an astrobiology teaching module
NASA Astrophysics Data System (ADS)
Kolb, Vera M.
2005-09-01
We are currently developing a teaching module on the NASA's Planetary Protection Program for UW-Parkside SENCER courses. SENCER stands for Science Education for New Civic Engagements and Responsibility. It is a national initiative of the National Science Foundation (NSF), now in its fifth year, to improve science education by teaching basic sciences through the complex public issues of the 21st century. The Planetary Protection Program is one such complex public issue. Teaching astrobiology and the NASA's goals via the Planetary Protection module within the SENCER courses seems to be a good formula to reach large number of students in an interesting and innovative way. We shall describe the module that we are developing. It will be launched on our web site titled "Astrobiology at Parkside" (http://oldweb.uwp.edu/academic/chemistry/kolb/organic_chemistry/, or go to Google and then to Vera Kolb Home Page), and thus will be available for teaching to all interested parties.
Improvement of Automated POST Case Success Rate Using Support Vector Machines
NASA Technical Reports Server (NTRS)
Zwack, Mathew R.; Dees, Patrick D.
2017-01-01
During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near instantaneous throughput of vehicle cases.3 As noted in [4] work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points.
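The sketch below is only a schematic of the stated approach: train a support vector machine on previously executed cases, with design variables labelled by whether the trajectory case closed, and use it to screen new points before spending POST run time on them. The synthetic data and variable names are placeholders, not real POST results.

```python
# Screen candidate trajectory cases with an SVM trained on past convergence labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))            # e.g. thrust/weight, propellant fraction, pitch rate
converged = (0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.6 * X[:, 2] > 0.4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, converged, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Only run the expensive trajectory tool on cases likely to close.
candidates = rng.uniform(size=(10, 3))
likely = clf.predict_proba(candidates)[:, 1] > 0.5
print("cases worth running:", candidates[likely].shape[0])
```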
The T.M.R. Data Dictionary: A Management Tool for Data Base Design
Ostrowski, Maureen; Bernes, Marshall R.
1984-01-01
In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.
Large Crawler Crane for new lightning protection system
2007-10-25
A large crawler crane begins moving away from the turn basin at the Launch Complex 39 Area on NASA's Kennedy Space Center. The crane with its 70-foot boom will be moved to Launch Pad 39B and used to construct a new lightning protection system for the Constellation Program and Ares/Orion launches. Pad B will be the site of the first Ares vehicle launch, including Ares I-X which is scheduled for April 2009.
NASA Astrophysics Data System (ADS)
Fawzy, Wafaa M.
2010-10-01
A FORTRAN code is developed for simulation and fitting the fine structure of a planar weakly-bonded open-shell complex that consists of a diatomic radical in a ³Σ electronic state and a diatomic or a polyatomic closed-shell molecule. The program sets up the proper total Hamiltonian matrix for a given J value and takes account of electron-spin-electron-spin, electron-spin rotation interactions, and the quartic and sextic centrifugal distortion terms within the complex. Also, R-dependence of electron-spin-electron-spin and electron-spin rotation couplings are considered. The code does not take account of effects of large-amplitude internal rotation of the diatomic radical within the complex. It is assumed that the complex has a well defined equilibrium geometry so that effects of large amplitude motion are negligible. Therefore, the computer code is suitable for a near-rigid rotor. Numerical diagonalization of the matrix provides the eigenvalues and the eigenfunctions that are necessary for calculating energy levels, frequencies, relative intensities of infrared or microwave transitions, and expectation values of the quantum numbers within the complex. Goodness of all the quantum numbers, with exception of J and parity, depends on relative sizes of the product of the rotational constants and quantum numbers (i.e. BJ, CJ, and AK), electron-spin-electron-spin, and electron-spin rotation couplings, as well as the geometry of the complex. Therefore, expectation values of the quantum numbers are calculated in the eigenfunctions basis of the complex. The computational time for the least squares fits has been significantly reduced by using the Hellmann-Feynman theory for calculating the derivatives. The computer code is useful for analysis of high resolution infrared and microwave spectra of a planar near-rigid weakly-bonded open-shell complex that contains a diatomic fragment in a ³Σ electronic state and a closed-shell molecule. The computer program was successfully applied to analysis and fitting the observed high resolution infrared spectra of the O2–HF/O2–DF and O2–N2O complexes. Test input file for simulation and fitting the high resolution infrared spectrum of the O2–DF complex is provided. Program summary. Program title: TSIG_COMP Catalogue identifier: AEGM_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGM_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 10 030 No. of bytes in distributed program, including test data, etc.: 51 663 Distribution format: tar.gz Programming language: Fortran 90, free format Computer: SGI Origin 3400, workstations and PCs Operating system: Linux, UNIX and Windows (see Restrictions below) RAM: Case dependent Classification: 16.2 Nature of problem: TSIG_COMP calculates frequencies, relative intensities, and expectation values of the various quantum numbers and parities of bound states involved in allowed ro-vibrational transitions in semi-rigid planar weakly-bonded open-shell complexes. The complexes of interest contain a free radical in a ³Σ state and a closed-shell partner, where the electron-spin-electron-spin interaction, electron-spin rotation interaction, and centrifugal forces significantly modify the spectral patterns.
To date, ab initio methods are incapable of taking these effects into account to provide accurate predictions for the ro-vibrational energy levels of the complexes of interest. In the TSIG_COMP program, the problem is solved by using the proper effective Hamiltonian and molecular basis set. Solution method: The program uses a Hamiltonian operator that takes into account vibration, end-over-end rotation, electron-spin-electron-spin and electron-spin rotation interactions as well as the various centrifugal distortion terms. The Hamiltonian operator and the molecular basis set are used to set up the Hamiltonian matrix in the inertial axis system of the complex of interest. Diagonalization of the Hamiltonian matrix provides the eigenvalues and the eigenfunctions for the bound ro-vibrational states. These eigenvalues and eigenfunctions are used to calculate frequencies and relative intensities of the allowed infrared or microwave transitions as well as expectation values of all the quantum numbers and parities of states involved in the transitions. The program employs the method of least-squares fits to fit the observed frequencies to the calculated frequencies, providing the molecular parameters that determine the geometry of the complex of interest. Restrictions: The number of transitions and parameters included in the fits is limited to 80 parameters and 200 transitions; however, these numbers can be increased by adjusting the dimensions of the arrays (not recommended). Running the program under MS Windows is recommended for simulations of any number of transitions and for fitting a relatively small number of parameters and transitions (maximum 15 parameters and 82 transitions); for fitting a larger number of parameters a run-time error may occur. Because spectra of weakly bonded complexes are recorded at low temperatures, in most cases fittings can be performed under MS Windows. Running time: Problem-dependent. The provided test input for Linux fits 82 transitions and 21 parameters; the actual run time is 62 minutes. The provided test input file for MS Windows fits 82 transitions and 15 parameters; the actual runtime is 5 minutes.
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistor characteristics. The ISS power system presents numerous challenges with respect to system stability such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DOE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identification of the ones with potential for instability. In this paper we will describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
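As a toy illustration of the verification idea, and not the ISS models themselves, the sketch below sweeps a two-level full-factorial design over a few source-filter and load parameters and checks a simplified impedance-ratio stability criterion for each run. The single LC filter feeding a constant-power load, and all parameter levels, are assumptions for illustration only.

```python
# Two-level full-factorial screening of a simplified source/load impedance
# stability check (|Z_source| < |Z_load| across frequency).
import itertools
import numpy as np

def max_impedance_ratio(L, C, P, V=120.0):
    w = 2 * np.pi * np.logspace(1, 5, 400)          # 10 Hz .. 100 kHz
    z_l = 0.05 + 1j * w * L                          # filter inductor with small series resistance
    z_c = 1.0 / (1j * w * C)
    z_source = z_l * z_c / (z_l + z_c)               # LC output impedance
    z_load = V**2 / P                                # constant-power load magnitude
    return np.max(np.abs(z_source)) / z_load         # >1 flags a potential instability

factors = {"L": (10e-6, 100e-6), "C": (100e-6, 1000e-6), "P": (500.0, 2000.0)}
for levels in itertools.product(*factors.values()):
    run = dict(zip(factors, levels))
    ratio = max_impedance_ratio(**run)
    print(run, "margin ok" if ratio < 1.0 else "flag for detailed simulation")
```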
Sizing of complex structure by the integration of several different optimal design algorithms
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.
1974-01-01
Practical design of large-scale structures can be accomplished with the aid of the digital computer by bringing together in one computer program algorithms of nonlinear mathematical programing and optimality criteria with weight-strength and other so-called engineering methods. Applications of this approach to aviation structures are discussed with a detailed description of how the total problem of structural sizing can be broken down into subproblems for best utilization of each algorithm and for efficient organization of the program into iterative loops. Typical results are examined for a number of examples.
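One of the "engineering methods" referred to above, fully stressed design, is easy to sketch. The toy resizing loop below only illustrates that optimality criterion for a statically determinate case, with invented loads and allowable stress, and says nothing about how the paper couples such rules to nonlinear mathematical programming.

```python
# Fully stressed design: resize each member so its stress approaches the allowable.

def fully_stressed_design(loads, areas, allowable=250.0e6, iters=20):
    for _ in range(iters):
        stresses = [f / a for f, a in zip(loads, areas)]              # sigma = F / A
        areas = [a * s / allowable for a, s in zip(areas, stresses)]  # A <- A * sigma / sigma_allow
    return areas, stresses

areas, stresses = fully_stressed_design(loads=[1.2e5, 0.7e5], areas=[1e-3, 1e-3])
print([f"{a*1e6:.1f} mm^2" for a in areas], [f"{s/1e6:.0f} MPa" for s in stresses])
```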
Bush, Ian E.
1980-01-01
The lessons of the 70's with MIS were largely painful, often the same as those of the 60's, and were found in different phases on two continents. On examination this turns out to be true for many non-medical fields, true for systems programming, and thus a very general phenomenon. It is related to the functional complexity rather than to the sheer size of the software required, and above all to the relative neglect of human factors at all levels of software and hardware design. Simple hierarchical theory is a useful tool for analyzing complex systems and restoring the necessary dominance of common sense human factors. An example shows the very large effects of neglecting these factors on costs and benefits of MIS and their sub-systems.
Transport processes near coastal ocean outfalls
Noble, M.A.; Sherwood, C.R.; Lee, Hooi-Ling; Xu, Jie; Dartnell, P.; Robertson, G.; Martini, M.
2001-01-01
The central Southern California Bight is an urbanized coastal ocean where complex topography and large-scale atmospheric and oceanographic forcing have led to numerous sediment-distribution patterns. Two large embayments, Santa Monica and San Pedro Bays, are connected by the short, very narrow shelf off the Palos Verdes peninsula. Ocean-sewage outfalls are located in the middle of Santa Monica Bay, on the Palos Verdes shelf and at the southeastern edge of San Pedro Bay. In 1992, the US Geological Survey, together with allied agencies, began a series of programs to determine the dominant processes that transport sediment and associated pollutants near the three ocean outfalls. As part of these programs, arrays of instrumented moorings that monitor currents, waves, water clarity, water density and collect resuspended materials were deployed on the continental shelf and slope; information was also collected on the sediment and contaminant distributions in the region. The data and models developed for the Palos Verdes shelf suggest that the large reservoir of DDT/DDE in the coastal ocean sediments will continue to be exhumed and transported along the shelf for a long time. On the Santa Monica shelf, very large internal waves, or bores, are generated at the shelf break. The near-bottom currents associated with these waves sweep sediments and the associated contaminants from the shelf onto the continental slope. A new program underway on the San Pedro shelf will determine if water and contaminants from a nearby ocean outfall are transported to the local beaches by coastal ocean processes. The large variety of processes found to transport sediments and contaminants in this small region of the continental margin suggests that in regions with complex topography, local processes change markedly over small spatial scales. One cannot necessarily infer that the dominant transport processes will be similar even in adjacent regions.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
IMPETUS - Interactive MultiPhysics Environment for Unified Simulations.
Ha, Vi Q; Lykotrafitis, George
2016-12-08
We introduce IMPETUS - Interactive MultiPhysics Environment for Unified Simulations, an object oriented, easy-to-use, high performance, C++ program for three-dimensional simulations of complex physical systems that can benefit a large variety of research areas, especially in cell mechanics. The program implements cross-communication between locally interacting particles and continuum models residing in the same physical space while a network facilitates long-range particle interactions. Message Passing Interface is used for inter-processor communication for all simulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability applicable to solving very large highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code which employs behavioral decomposition and complex fault-error handling models. This new capability is called MC-HARP which efficiently solves reliability models with non-constant failures rates (Weibull). Common mode failure modeling is also a specialty.
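A bare-bones sketch of the kind of calculation such a code performs, without the variance-reduction techniques that make MC-HARP efficient: Monte Carlo estimation of the unreliability of a small fault-tolerant system whose components follow Weibull (non-constant) failure rates. The triplex-with-imperfect-coverage model and all parameters are invented for illustration.

```python
# Plain Monte Carlo unreliability estimate for a 3-unit system that tolerates
# one covered failure; component lifetimes are Weibull distributed.
import random

def system_fails(mission_time, shape=1.5, scale=5000.0, coverage=0.999):
    """One trial: True if the system is lost before the end of the mission."""
    times = sorted(random.weibullvariate(scale, shape) for _ in range(3))
    first, second = times[0], times[1]
    if first > mission_time:
        return False                      # no failures during the mission
    if random.random() > coverage:
        return True                       # first failure not covered -> system loss
    return second <= mission_time         # second failure exhausts the redundancy

def unreliability(mission_time=1000.0, trials=200_000):
    return sum(system_fails(mission_time) for _ in range(trials)) / trials

print(f"estimated unreliability: {unreliability():.2e}")
```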
Learning directed acyclic graphs from large-scale genomics data.
Nikolay, Fabio; Pesavento, Marius; Kritikos, George; Typas, Nassos
2017-09-20
In this paper, we consider the problem of learning the genetic interaction map, i.e., the topology of a directed acyclic graph (DAG) of genetic interactions from noisy double-knockout (DK) data. Based on a set of well-established biological interaction models, we detect and classify the interactions between genes. We propose a novel linear integer optimization program called the Genetic-Interactions-Detector (GENIE) to identify the complex biological dependencies among genes and to compute the DAG topology that matches the DK measurements best. Furthermore, we extend the GENIE program by incorporating genetic interaction profile (GI-profile) data to further enhance the detection performance. In addition, we propose a sequential scalability technique for large sets of genes under study, in order to provide statistically significant results for real measurement data. Finally, we show via numeric simulations that the GENIE program and the GI-profile data extended GENIE (GI-GENIE) program clearly outperform the conventional techniques and present real data results for our proposed sequential scalability technique.
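The GENIE integer program itself is not reproduced here; the sketch below only illustrates the detection step that precedes it, classifying each gene pair under the standard multiplicative null model for double-knockout fitness. The fitness values and tolerance are made up, and the DAG-fitting optimization is omitted.

```python
# Classify gene-gene interactions from single- and double-knockout fitness
# under a multiplicative null model.

def classify_interactions(single, double, tol=0.1):
    """single: {gene: fitness}; double: {(gene_a, gene_b): fitness}."""
    calls = {}
    for (a, b), w_ab in double.items():
        expected = single[a] * single[b]
        eps = w_ab - expected                 # epistasis score
        if abs(eps) <= tol:
            calls[(a, b)] = "no interaction"
        elif eps < 0:
            calls[(a, b)] = "aggravating (e.g. parallel pathways)"
        else:
            calls[(a, b)] = "alleviating (e.g. same pathway/complex)"
    return calls

single = {"geneA": 0.9, "geneB": 0.8, "geneC": 0.7}
double = {("geneA", "geneB"): 0.70, ("geneA", "geneC"): 0.25, ("geneB", "geneC"): 0.85}
for pair, call in classify_interactions(single, double).items():
    print(pair, call)
```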
Effects of Aromatherapy on Test Anxiety and Performance in College Students
ERIC Educational Resources Information Center
Dunnigan, Jocelyn Marie
2013-01-01
Test anxiety is a complex, multidimensional construct composed of cognitive, affective, and behavioral components that have been shown to negatively affect test performance. Furthermore, test anxiety is a pervasive problem in modern society largely related to the evaluative nature of educational programs, therefore meriting study of its nature,…
The Use of Illustrations in Large-Scale Science Assessment: A Comparative Study
ERIC Educational Resources Information Center
Wang, Chao
2012-01-01
This dissertation addresses the complexity of test illustrations design across cultures. More specifically, it examines how the characteristics of illustrations used in science test items vary across content areas, assessment programs, and cultural origins. It compares a total of 416 Grade 8 illustrated items from the areas of earth science, life…
Networking at Conferences: Developing Your Professional Support System
ERIC Educational Resources Information Center
Kowalsky, Michelle
2012-01-01
The complexity and scale of any large library, education, or technology conference can sometimes be overwhelming. Therefore, spending time reviewing the conference program and perusing the workshop offerings in advance can help you stay organized and make the most of your time at the event. Planning in advance will help you manage potential time…
Kevin C. Vogler; Alan A. Ager; Michelle A. Day; Michael Jennings; John D. Bailey
2015-01-01
The implementation of US federal forest restoration programs on national forests is a complex process that requires balancing diverse socioecological goals with project economics. Despite both the large geographic scope and substantial investments in restoration projects, a quantitative decision support framework to locate optimal project areas and examine...
1984-11-01
[Flow chart: complete list of locations/sites; evaluation of past operations at listed sites; potential hazard to health/welfare; regulatory agencies...] ...siting studies were also a part of this large complex project. ...soil, groundwater sampling and analysis, and remedial concept engineering. Project
ERIC Educational Resources Information Center
Duis, Jennifer M.; Schafer, Laurel L.; Nussbaum, Sophia; Stewart, Jaclyn J.
2013-01-01
Learning goal (LG) identification can greatly inform curriculum, teaching, and evaluation practices. The complex laboratory course setting, however, presents unique obstacles in developing appropriate LGs. For example, in addition to the large quantity and variety of content supported in the general chemistry laboratory program, the interests of…
ERIC Educational Resources Information Center
Thoder, Vincent J.; Hesky, James G.; Cautilli, Joseph D.
2010-01-01
Children often have complex emotional and behavioral disorders (ADHD, ODD, Depression, PTSD, etc.). A large amount of research exists in the behavioral treatment of children with these disorders regarding specific behavioral problems. Much less research exists for the treatment of comprehensive problematic behaviors that these children experience…
Software Tools | Office of Cancer Clinical Proteomics Research
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer from large-scale proteogenomic datasets, and to advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
Developing a safe on-orbit cryogenic depot
NASA Technical Reports Server (NTRS)
Bahr, Nicholas J.
1992-01-01
New U.S. space initiatives will require technology to realize planned programs such as piloted lunar and Mars missions. Key to the optimal execution of such missions are high performance orbit transfer vehicles and propellant storage facilities. Large amounts of liquid hydrogen and oxygen demand a uniquely designed on-orbit cryogenic propellant depot. Because of the inherent dangers in propellant storage and handling, a comprehensive system safety program must be established. This paper shows how the myriad and complex hazards demonstrate the need for an integrated safety effort to be applied from program conception through operational use. Even though the cryogenic depot is still in the conceptual stage, many of the hazards have been identified, including fatigue due to heavy thermal loading from environmental and operating temperature extremes, micrometeoroid and/or depot ancillary equipment impact (this is an important problem due to the large surface area needed to house the large quantities of propellant), docking and maintenance hazards, and hazards associated with extended extravehicular activity. Various safety analysis techniques were presented for each program phase. Specific system safety implementation steps were also listed. Enhanced risk assessment was demonstrated through the incorporation of these methods.
A component-based software environment for visualizing large macromolecular assemblies.
Sanner, Michel F
2005-03-01
The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.
Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola
2016-01-01
Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.Graphical abstract.
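The sketch below is not the SciLuigi (or Luigi) API; it is a dependency-free illustration of the flow-based style the paper builds on, in which tasks expose outputs, dependencies are wired explicitly between tasks, and a small scheduler runs them in dependency order. The "tasks" are placeholders for real preprocessing, cross-validation and model-training steps.

```python
# Minimal flow-style workflow: explicit task dependencies, run in dependency order.

class Task:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)  # inputs: upstream Tasks
        self.output = None

    def run(self):
        self.output = self.func(*[t.output for t in self.inputs])
        print(f"ran {self.name} -> {self.output}")

def run_workflow(tasks):
    done = set()
    while len(done) < len(tasks):
        for t in tasks:
            if t.name not in done and all(u.name in done for u in t.inputs):
                t.run()
                done.add(t.name)

raw = Task("load_data", lambda: list(range(10)))
split = Task("train_test_split", lambda d: (d[:7], d[7:]), inputs=[raw])
model = Task("train_model", lambda s: sum(s[0]) / len(s[0]), inputs=[split])
evaluate = Task("evaluate", lambda s, m: abs(sum(s[1]) / len(s[1]) - m), inputs=[split, model])
run_workflow([raw, split, model, evaluate])
```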
CrasyDSE: A framework for solving Dyson–Schwinger equations☆
Huber, Markus Q.; Mitter, Mario
2012-01-01
Dyson–Schwinger equations are important tools for non-perturbative analyses of quantum field theories. For example, they are very useful for investigations in quantum chromodynamics and related theories. However, sometimes progress is impeded by the complexity of the equations. Thus automating parts of the calculations will certainly be helpful in future investigations. In this article we present a framework for such an automation based on a C++ code that can deal with a large number of Green functions. Since also the creation of the expressions for the integrals of the Dyson–Schwinger equations needs to be automated, we defer this task to a Mathematica notebook. We illustrate the complete workflow with an example from Yang–Mills theory coupled to a fundamental scalar field that has been investigated recently. As a second example we calculate the propagators of pure Yang–Mills theory. Our code can serve as a basis for many further investigations where the equations are too complicated to tackle by hand. It also can easily be combined with DoFun, a program for the derivation of Dyson–Schwinger equations.1 Program summary Program title: CrasyDSE Catalogue identifier: AEMY _v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 49030 No. of bytes in distributed program, including test data, etc.: 303958 Distribution format: tar.gz Programming language: Mathematica 8 and higher, C++. Computer: All on which Mathematica and C++ are available. Operating system: All on which Mathematica and C++ are available (Windows, Unix, Mac OS). Classification: 11.1, 11.4, 11.5, 11.6. Nature of problem: Solve (large) systems of Dyson–Schwinger equations numerically. Solution method: Create C++ functions in Mathematica to be used for the numeric code in C++. This code uses structures to handle large numbers of Green functions. Unusual features: Provides a tool to convert Mathematica expressions into C++ expressions including conversion of function names. Running time: Depending on the complexity of the investigated system solving the equations numerically can take seconds on a desktop PC to hours on a cluster. PMID:25540463
Cx-02 Program, workshop on modeling complex systems
Mossotti, Victor G.; Barragan, Jo Ann; Westergard, Todd D.
2003-01-01
This publication contains the abstracts and program for the workshop on complex systems that was held on November 19-21, 2002, in Reno, Nevada. Complex systems are ubiquitous within the realm of the earth sciences. Geological systems consist of a multiplicity of linked components with nested feedback loops; the dynamics of these systems are non-linear, iterative, multi-scale, and operate far from equilibrium. That notwithstanding, it appears that, with the exception of papers on seismic studies, geology and geophysics work has been disproportionately underrepresented at regional and national meetings on complex systems relative to papers in the life sciences. This is somewhat puzzling because geologists and geophysicists are, in many ways, preadapted to thinking of complex system mechanisms. Geologists and geophysicists think about processes involving large volumes of rock below the sunlit surface of Earth, the accumulated consequence of processes extending hundreds of millions of years in the past. Not only do geologists think in the abstract by virtue of the vast time spans, but most of the evidence is also out of sight. A primary goal of this workshop is to begin to bridge the gap between the Earth sciences and life sciences through demonstration of the universality of complex systems science, both philosophically and in model structures.
Canard, Gabriel; Koeller, Sylvain; Bernardinelli, Gérald; Piguet, Claude
2008-01-23
The beneficial entropic effect, which may be expected from the connection of three tridentate binding units to a strain-free covalent tripod for complexing nine-coordinate cations (Mz+ = Ca2+, La3+, Eu3+, Lu3+), is quantitatively analyzed by using a simple thermodynamic additive model. The switch from pure intermolecular binding processes, characterizing the formation of the triple-helical complexes [M(L2)3]z+, to a combination of inter- and intramolecular complexation events in [M(L8)]z+ shows that the ideal structural fit observed in [M(L8)]z+ indeed masks large energetic constraints. This limitation is evidenced by the faint effective concentrations, ceff, which control the intramolecular ring-closing reactions operating in [M(L8)]z+. This predominance of the thermodynamic approach over the usual structural analysis agrees with the hierarchical relationships linking energetics and structures. Its simple estimation by using a single microscopic parameter, ceff, opens novel perspectives for the molecular tuning of specific receptors for the recognition of large cations, a crucial point for the programming of heterometallic f-f complexes under thermodynamic control.
NASA Astrophysics Data System (ADS)
Henkel, Daniela; Eisenhauer, Anton
2017-04-01
During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex and large-scale projects demand new competencies in forming, managing, and using large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, discrepant individual expectations of teamwork, and differences in collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on experience gained in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 program. Possible strategies are discussed to circumvent and avoid conflicts already at the beginning of the project.
SimGen: A General Simulation Method for Large Systems.
Taylor, William R
2017-02-03
SimGen is a stand-alone computer program that reads a script of commands to represent complex macromolecules, including proteins and nucleic acids, in a structural hierarchy that can then be viewed using an integral graphical viewer or animated through a high-level application programming interface in C++. Structural levels in the hierarchy range from α-carbon or phosphate backbones through secondary structure to domains, molecules, and multimers, with each level represented in an identical data structure that can be manipulated using the application programming interface. Unlike most coarse-grained simulation approaches, the higher-level objects represented in SimGen can be soft, allowing the lower-level objects that they contain to interact directly. The default motion simulated by SimGen is a Brownian-like diffusion that can be set to occur across all levels of representation in the hierarchy. Links can also be defined between objects, which, when combined with large high-level random movements, result in an effective search strategy for constraint satisfaction, including structure prediction from predicted pairwise distances. The implementation of SimGen makes use of the hierarchic data structure to avoid unnecessary calculation, especially for collision detection, allowing it to be simultaneously run and viewed on a laptop computer while simulating large systems of over 20,000 objects. It has been used previously to model complex molecular interactions including the motion of a myosin-V dimer "walking" on an actin fibre, RNA stem-loop packing, and the simulation of cell motion and aggregation. Several extensions to this original functionality are described. Copyright © 2016 The Francis Crick Institute. Published by Elsevier Ltd. All rights reserved.
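As a rough illustration of the hierarchic representation described above, the sketch below stores every structural level in the same node type and applies Brownian-like random steps at each level, so that a move of a high-level object rigidly carries its lower-level contents along. The class, function names, and step sizes are invented for illustration and are not SimGen's interface.

    // Illustrative sketch only (not SimGen's API): every structural level is held
    // in the same node type, and a random step applied at a high level rigidly
    // carries all lower-level objects along with it.
    #include <cstdio>
    #include <random>
    #include <vector>

    struct Node {
        double x = 0, y = 0, z = 0;     // position of this object
        std::vector<Node> children;     // lower-level objects (domains, residues, ...)
    };

    static std::mt19937 rng(42);

    void shift(Node& n, double dx, double dy, double dz) {
        n.x += dx; n.y += dy; n.z += dz;
        for (Node& c : n.children) shift(c, dx, dy, dz);  // rigid move of the subtree
    }

    void brownianStep(Node& n, double amplitude) {
        std::normal_distribution<double> g(0.0, amplitude);
        shift(n, g(rng), g(rng), g(rng));                 // move this level as a whole
        for (Node& c : n.children)
            brownianStep(c, 0.5 * amplitude);             // smaller jitter at finer levels
    }

    int main() {
        Node molecule;
        molecule.children.resize(3);                              // e.g. three domains
        for (Node& d : molecule.children) d.children.resize(100); // coarse residues
        for (int step = 0; step < 1000; ++step) brownianStep(molecule, 0.1);
        std::printf("molecule centre after diffusion: %.3f %.3f %.3f\n",
                    molecule.x, molecule.y, molecule.z);
        return 0;
    }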
Developing science gateways for drug discovery in a grid environment.
Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra
2016-01-01
Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, we face the growing challenge of providing the life sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges, and highlighting the advantages which the CCA approach offers for collaborative development.
Decision-Making Rationales among Quebec VET Student Aged 25 and Older
ERIC Educational Resources Information Center
Cournoyer, Louis; Deschenaux, Frédéric
2017-01-01
Each year, a large number of students aged 25 years and over take part in vocational and education training (VET) programs in the Province of Quebec, Canada. The life experiences of many of these adults are marked by complex psychosocial and professional events, which may have influenced their career decision-making processes. This paper aimed to…
Software engineering as an engineering discipline
NASA Technical Reports Server (NTRS)
Freedman, Glenn B.
1988-01-01
The purpose of this panel is to explore the emerging field of software engineering from a variety of perspectives: university programs; industry training and definition; government development; and technology transfer. In doing this, the panel will address the issues of distinctions among software engineering, computer science, and computer hardware engineering as they relate to the challenges of large, complex systems.
ERIC Educational Resources Information Center
Vajravelu, Kuppalapalle; Muhs, Tammy
2016-01-01
Successful science and engineering programs require proficiency and dynamics in mathematics classes to enhance the learning of complex subject matter with a sufficient amount of practical problem solving. Improving student performance and retention in mathematics classes requires inventive approaches. At the University of Central Florida (UCF) the…
ERIC Educational Resources Information Center
Clinton, Gregory; Rieber, Lloyd P.
2010-01-01
The Studio curriculum in the Learning, Design, and Technology (formerly Instructional Technology) program at a large research-extensive university in the southeastern U.S. represents a deliberate application of contemporary theory of how adults learn complex information in ill-structured domains. The Studio curriculum, part of a graduate program…
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.
Designed to prepare students to be engine mechanics working on automotive and large stationary diesel engines, this instructor's guide contains eight units arranged from simple to complex to facilitate student learning. Each contains behavioral objectives, a content outline, understandings and teaching approaches necessary to develop the content,…
ERIC Educational Resources Information Center
Feldmann, Richard J.; And Others
1972-01-01
Computer graphics provides a valuable tool for the representation and a better understanding of structures, both small and large. Accurate and rapid construction, manipulation, and plotting of structures, such as macromolecules as complex as hemoglobin, are performed by a collection of computer programs and a time-sharing computer. (21 references)…
Measuring School Performance To Improve Student Achievement and To Reward Effective Programs.
ERIC Educational Resources Information Center
Heistad, Dave; Spicuzza, Rick
This paper describes the method that the Minneapolis Public School system (MPS), Minnesota, uses to measure school and student performance. MPS uses a multifaceted system that both captures and accounts for the complexity of a large urban school district. The system incorporates: (1) a hybrid model of critical indicators that report on level of…
The National Cancer Institute's (NCI) Clinical Proteomic Technologies for Cancer (CPTC) initiative at the National Institutes of Health has entered into a memorandum of understanding (MOU) with the Korea Institute of Science and Technology (KIST). This MOU promotes proteomic technology optimization and standards implementation in large-scale international programs.
The Meaning of School from Dropout's View Point (A Phenomenological Study)
ERIC Educational Resources Information Center
Habibi; Setiawan, Cally
2017-01-01
Student dropouts are a complex problem in Indonesia. Some of the dropouts living in rural areas have migrated to the large cities. This contributes to the growth of child labor, which is already one of the major problems in Indonesia. Knowledge about the meaning of school from their perspective could be helpful for policy and programs related to dropout…
Divide and Recombine for Large Complex Data
2017-12-01
Empirical Methods in Natural Language Processing, October 2014 ...low-latency data processing systems. Declarative Languages for Interactive Visualization: The Reactive Vega Stack. Another thread of XDATA research ...for array processing operations embedded in the R programming language. Vector virtual machines work well for long vectors. One of the most...
The Emergence and Unfolding of Telemonitoring Practices in Different Healthcare Organizations
2018-01-01
Telemonitoring, a sub-category of telemedicine, is promoted as a solution to meet the challenges in Western healthcare systems in terms of an increasing population of people with chronic conditions and fragmentation issues. Recent findings from large-scale telemonitoring programs reveal that these promises are difficult to meet in complex real-life settings which may be explained by concentrating on the practices that emerge when telemonitoring is used to treat patients with chronic conditions. This paper explores the emergence and unfolding of telemonitoring practices in relation to a large-scale, inter-organizational home telemonitoring program which involved 5 local health centers, 10 district nurse units, four hospitals, and 225 general practice clinics in Denmark. Twenty-eight interviews and 28 h of observations of health professionals and administrative staff were conducted over a 12-month period from 2014 to 2015. This study’s findings reveal how telemonitoring practices emerged and unfolded differently among various healthcare organizations. This study suggests that the emergence and unfolding of novel practices is the result of complex interplay between existing work practices, alterations of core tasks, inscriptions in the technology, and the power to either adopt or ignore such novel practices. The study enhances our understanding of how novel technology like telemonitoring impacts various types of healthcare organizations when implemented in a complex inter-organizational context. PMID:29301384
Ad Hoc modeling, expert problem solving, and R&T program evaluation
NASA Technical Reports Server (NTRS)
Silverman, B. G.; Liebowitz, J.; Moustakis, V. S.
1983-01-01
A simplified cost and time (SCAT) analysis program utilizing personal-computer technology is presented and demonstrated in the case of the NASA-Goddard end-to-end data system. The difficulties encountered in implementing complex program-selection and evaluation models in the research and technology field are outlined. The prototype SCAT system described here is designed to allow user-friendly ad hoc modeling in real time and at low cost. A worksheet constructed on the computer screen displays the critical parameters and shows how each is affected when one is altered experimentally. In the NASA case, satellite data-output and control requirements, ground-facility data-handling capabilities, and project priorities are intricately interrelated. Scenario studies of the effects of spacecraft phaseout or new spacecraft on throughput and delay parameters are shown. The use of a network of personal computers for higher-level coordination of decision-making processes is suggested, as a complement or alternative to complex large-scale modeling.
Applications of artificial intelligence to mission planning
NASA Technical Reports Server (NTRS)
Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.
1990-01-01
The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. Because of this and other difficulties, many conventional operations research techniques are infeasible or inadequate for solving the problems by themselves. Therefore, the purpose is to examine various artificial intelligence (AI) techniques to assist conventional techniques or to replace them. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.
Sensitivity of Precipitation in Coupled Land-Atmosphere Models
NASA Technical Reports Server (NTRS)
Neelin, David; Zeng, N.; Suarez, M.; Koster, R.
2004-01-01
The project objective was to understand mechanisms by which atmosphere-land-ocean processes impact precipitation in the mean climate and interannual variations, focusing on tropical and subtropical regions. A combination of modeling tools was used: an intermediate complexity land-atmosphere model developed at UCLA known as the QTCM and the NASA Seasonal-to-Interannual Prediction Program general circulation model (NSIPP GCM). The intermediate complexity model was used to develop hypotheses regarding the physical mechanisms and theory for the interplay of large-scale dynamics, convective heating, cloud radiative effects and land surface feedbacks. The theoretical developments were to be confronted with diagnostics from the more complex GCM to validate or modify the theory.
Polyglot Programming in Applications Used for Genetic Data Analysis
Nowak, Robert M.
2014-01-01
Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development. PMID:25197633
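A minimal sketch of the polyglot pattern described above: a compute-heavy kernel written in C++ and exposed to Python. The module name genetools, the function gc_content, and the choice of pybind11 as the binding layer are illustrative assumptions, not details of the author's framework.

    // Illustrative polyglot sketch (not the author's framework): a C++ kernel
    // exposed to Python through pybind11, keeping performance-critical code in C++.
    #include <pybind11/pybind11.h>
    #include <string>

    // Compute-heavy kernel kept in C++ for speed: GC content of a DNA sequence.
    static double gc_content(const std::string& seq) {
        if (seq.empty()) return 0.0;
        std::size_t gc = 0;
        for (char c : seq)
            if (c == 'G' || c == 'C' || c == 'g' || c == 'c') ++gc;
        return static_cast<double>(gc) / seq.size();
    }

    PYBIND11_MODULE(genetools, m) {           // importable from Python as `genetools`
        m.def("gc_content", &gc_content,
              "Fraction of G/C bases in a DNA sequence");
    }

Once compiled as a Python extension, the kernel can be called from higher-level code, for example import genetools; genetools.gc_content("ACGTGC"), so the user-facing logic stays in scripting languages while the numeric core remains in C++.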
2013-12-01
DisasterRecoveryExpenditure/Pages/default.aspx, Canadian Disaster Database, and www.fema.gov) ...Comparison of declaration criteria and disasters for $30 million ...the role of insurance in FEMA's Public Assistance program. The guidance provided in the 44 CFR has not kept up with the industry since being ...the nation. Insurance is a complex industry, which is a large component of the U.S...
The critical role of social workers in home-based primary care.
Reckrey, Jennifer M; Gettenberg, Gabrielle; Ross, Helena; Kopke, Victoria; Soriano, Theresa; Ornstein, Katherine
2014-01-01
The growing homebound population has many complex biomedical and psychosocial needs and requires a team-based approach to care (Smith, Ornstein, Soriano, Muller, & Boal, 2006). The Mount Sinai Visiting Doctors Program (MSVD), a large interdisciplinary home-based primary care program in New York City, has a vibrant social work program that is integrated into the routine care of homebound patients. We describe the assessment process used by MSVD social workers, highlight examples of successful social work care, and discuss why social workers' individualized care plans are essential for keeping patients with chronic illness living safely in the community. Despite barriers to widespread implementation, such social work involvement within similar home-based clinical programs is essential in the interdisciplinary care of our most needy patients.
Automatic differential analysis of NMR experiments in complex samples.
Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André
2018-06-01
Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species. Such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
Skivington, Kathryn; Lifshen, Marni; Mustard, Cameron
2016-11-22
Comprehensive workplace return-to-work policies, applied with consistency, can reduce length of time out of work and the risk of long-term disability. This paper reports on the findings from a qualitative study exploring managers' and return-to-work-coordinators' views on the implementation of their organization's new return-to-work program. To provide practical guidance to organizations in designing and implementing return-to-work programs for their employees. Semi-structured qualitative interviews were undertaken with 20 managers and 10 return-to-work co-ordinators to describe participants' perspectives on the progress of program implementation in the first 18 months of adoption. The study was based in a large healthcare organization in Ontario, Canada. Thematic analysis of the data was conducted. We identified tensions evident in the early implementation phase of the organization's return-to-work program. These tensions were attributed to uncertainties concerning roles and responsibilities and to circumstances where objectives or principles appeared to be in conflict. The implementation of a comprehensive and collaborative return-to-work program is a complex challenge. The findings described in this paper may provide helpful guidance for organizations embarking on the development and implementation of a return-to-work program.
Wang, Lihong; Gong, Zaiwu
2017-10-10
As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information of a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions, and obtains an optimal ranking of an HFLPR based on chance-restricted programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.
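For orientation, the generic form of a chance-constrained (chance-restricted) program is sketched below. This is a textbook formulation given only to illustrate the optimisation machinery named in the abstract, not the paper's specific model; here the random element ξ stands for the uncertain (hesitant) preference information and 1 - α is the required confidence level.

    % Generic chance-constrained program (illustrative, not the paper's model):
    % the constraint g(x,\xi) \le 0, which depends on the random element \xi,
    % is only required to hold with confidence level 1 - \alpha.
    \begin{aligned}
    \min_{x \in X} \quad & f(x) \\
    \text{s.t.} \quad & \Pr\bigl[\, g(x,\xi) \le 0 \,\bigr] \;\ge\; 1 - \alpha .
    \end{aligned}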
NASA Technical Reports Server (NTRS)
Johnson, Sally C.; Boerschlein, David P.
1995-01-01
Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
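The rule-driven generation described above can be pictured with a small sketch; the rule, rates, and data types below are invented for illustration and are not the ASSIST input language. A single behavioural rule, "any working processor may fail", is expanded by a breadth-first search into all reachable states and transitions of the model instead of listing them by hand.

    // Illustrative sketch (not the ASSIST language itself): one behavioural rule
    // is expanded automatically into the reachable states and transitions of a
    // Markov reliability model.
    #include <cstdio>
    #include <map>
    #include <queue>
    #include <vector>

    struct Transition { int from, to; double rate; };

    int main() {
        const int working0 = 4;        // start with four working processors
        const double lambda = 1e-4;    // per-processor failure rate (illustrative)

        std::queue<int> frontier;      // states still to be expanded
        std::map<int, bool> seen;      // states already generated
        std::vector<Transition> model;

        frontier.push(working0);
        seen[working0] = true;
        while (!frontier.empty()) {
            int w = frontier.front(); frontier.pop();
            if (w == 0) continue;                      // absorbing state: system failed
            // Rule: one of the w working processors fails.
            Transition t{w, w - 1, w * lambda};
            model.push_back(t);
            if (!seen[w - 1]) { seen[w - 1] = true; frontier.push(w - 1); }
        }
        for (const Transition& t : model)
            std::printf("state %d -> state %d at rate %.2e\n", t.from, t.to, t.rate);
        return 0;
    }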
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2017-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
What drives the formation of massive stars and clusters?
NASA Astrophysics Data System (ADS)
Ochsendorf, Bram; Meixner, Margaret; Roman-Duval, Julia; Evans, Neal J., II; Rahman, Mubdi; Zinnecker, Hans; Nayak, Omnarayani; Bally, John; Jones, Olivia C.; Indebetouw, Remy
2018-01-01
Galaxy-wide surveys allow us to study star formation in unprecedented ways. In this talk, I will discuss our analysis of the Large Magellanic Cloud (LMC) and the Milky Way, and illustrate how studying both the large- and small-scale structure of galaxies is critical in addressing the question: what drives the formation of massive stars and clusters? I will show that ‘turbulence-regulated’ star formation models do not reproduce the massive star formation properties of GMCs in the LMC and Milky Way: this suggests that theory currently does not capture the full complexity of star formation on small scales. I will also report on the discovery of a massive star-forming complex in the LMC, which in many ways manifests itself as an embedded twin of 30 Doradus: this may shed light on the formation of R136 and 'Super Star Clusters' in general. Finally, I will highlight what we can expect in the next years in the field of star formation with large-scale sky surveys, ALMA, and our JWST-GTO program.
LISP as an Environment for Software Design: Powerful and Perspicuous
Blum, Robert L.; Walker, Michael G.
1986-01-01
The LISP language provides a useful set of features for prototyping knowledge-intensive, clinical applications software that is not found in most other programming environments. Medical computer programs that need large medical knowledge bases, such as programs for diagnosis, therapeutic consultation, education, simulation, and peer review, are hard to design, evolve continually, and often require major revisions. They necessitate an efficient and flexible program development environment. The LISP language and programming environments built around it are well suited for program prototyping. The lingua franca of artificial intelligence researchers, LISP facilitates building complex systems because it is simple yet powerful. Because of its simplicity, LISP programs can read, execute, modify and even compose other LISP programs at run time. Hence, it has been easy for system developers to create programming tools that greatly speed the program development process, and that may be easily extended by users. This has resulted in the creation of many useful graphical interfaces, editors, and debuggers, which facilitate the development of knowledge-intensive medical applications.
Family Structure and Child Well-Being: Integrating Family Complexity
Brown, Susan L.; Manning, Wendy D.; Stykes, J. Bart
2014-01-01
Although children’s family lives are diverse, the measurement of children’s living arrangements has lagged, focusing on the relationships of children to parents while largely ignoring sibling composition. Using data from the 2008 Survey of Income and Program Participation (N = 23,985) the authors documented patterns of family complexity among a nationally representative sample of children ages 0–17 living in a range of family structures. They also examined the independent and joint associations of family structure and family complexity on child economic well-being. Family complexity was independently related to economic disadvantage, namely, a lower income-to-needs ratio and a higher likelihood of public assistance receipt. The role of family complexity was partially contingent on family structure, with the positive association between family complexity and receipt of public assistance more pronounced for children in families with 2 married biological parents. This study demonstrates the utility of integrating family structure and family complexity in studies of children’s well-being. PMID:25620810
1977-05-01
RADC-TR-77-168, Final Technical Report, May 1977. VIKING SOFTWARE DATA, Martin Marietta Corporation. Approved for public release; distribution...practices employed on a large and complex system development by the Martin Marietta Corporation. The intent of the RADC program to which this document...the technique are described. The preparation of this report could not have been accomplished without considerable assistance from fellow Martin...
Work for Play: Careers in Video Game Development
ERIC Educational Resources Information Center
Liming, Drew; Vilorio, Dennis
2011-01-01
Video games are not only for play; they also provide work. Making video games is a serious--and big--business. Creating these games is complex and requires the collaboration of many developers, who perform a variety of tasks, from production to programming. They work for both small and large game studios to create games that can be played on many…
ERIC Educational Resources Information Center
Longford, Nicholas T.
Large scale surveys usually employ a complex sampling design and as a consequence, no standard methods for estimation of the standard errors associated with the estimates of population means are available. Resampling methods, such as jackknife or bootstrap, are often used, with reference to their properties of robustness and reduction of bias. A…
1988-06-22
Moog Incorporated, East Aurora, New York 14052-0013. ABSTRACT: The goals of U.S. space programs have created a need for large, complex, long-life ...Bernard Schroer, University of Alabama in Huntsville ...A Robotic Vehicle Global Route Planner for the 1990s, William J. Pollard, KMS Fusion Inc., Ann...
Environmental projects. Volume 2: Underground storage tanks compliance program
NASA Technical Reports Server (NTRS)
Kushner, L.
1987-01-01
Six large parabolic dish antennas are located at the Goldstone Deep Space Communications Complex (GDSCC) north of Barstow, California. As a large-scale facility located in a remote, isolated desert region, the GDSCC operations require numerous on-site storage facilities for gasoline, diesel and hydraulic oil. These essential fluids are stored in underground storage tanks (USTs). Because USTs may develop leaks with the resultant seepage of their hazardous contents into the surrounding soil, local, State and Federal authorities have adopted stringent regulations for the testing and maintenance of USTs. Under the supervision of JPL's Office of Telecommunications and Data Acquisition, a year-long program has brought 27 USTs at the Goldstone Complex into compliance with Federal, State of California and County of San Bernardino regulations. Of these 27 USTs, 15 are operating today, 11 have been temporarily closed down, and 1 abandoned in place. In 1989, the 15 USTs now operating at the Goldstone DSCC will be replaced either by modern, double-walled USTs equipped with automatic sensors for leak detection, or by above-ground storage tanks. The 11 inactivated USTs are to be excavated, removed and disposed of according to regulation.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
Strategies for Ground Based Testing of Manned Lunar Surface Systems
NASA Technical Reports Server (NTRS)
Beyer, Jeff; Peacock, Mike; Gill, Tracy
2009-01-01
Integrated testing (such as Multi-Element Integrated Test (MEIT)) is critical to reducing risks and minimizing problems encountered during assembly, activation, and on-orbit operation of large, complex manned spacecraft. It provides the best implementation of "Test Like You Fly." Planning for integrated testing needs to begin at the earliest stages of Program definition. Program leadership needs to fully understand and buy in to what integrated testing is and why it needs to be performed. As the Program evolves and design and schedules mature, continually look for suitable opportunities to perform testing where enough components are together in one place at one time. The benefits to be gained are well worth the costs.
Topology and Control of the Cell-Cycle-Regulated Transcriptional Circuitry
Haase, Steven B.; Wittenberg, Curt
2014-01-01
Nearly 20% of the budding yeast genome is transcribed periodically during the cell division cycle. The precise temporal execution of this large transcriptional program is controlled by a large interacting network of transcriptional regulators, kinases, and ubiquitin ligases. Historically, this network has been viewed as a collection of four coregulated gene clusters that are associated with each phase of the cell cycle. Although the broad outlines of these gene clusters were described nearly 20 years ago, new technologies have enabled major advances in our understanding of the genes comprising those clusters, their regulation, and the complex regulatory interplay between clusters. More recently, advances are being made in understanding the roles of chromatin in the control of the transcriptional program. We are also beginning to discover important regulatory interactions between the cell-cycle transcriptional program and other cell-cycle regulatory mechanisms such as checkpoints and metabolic networks. Here we review recent advances and contemporary models of the transcriptional network and consider these models in the context of eukaryotic cell-cycle controls. PMID:24395825
Prenatal Alcohol Exposure and Cellular Differentiation
Veazey, Kylee J.; Muller, Daria; Golding, Michael C.
2013-01-01
Exposure to alcohol significantly alters the developmental trajectory of progenitor cells and fundamentally compromises tissue formation (i.e., histogenesis). Emerging research suggests that ethanol can impair mammalian development by interfering with the execution of molecular programs governing differentiation. For example, ethanol exposure disrupts cellular migration, changes cell–cell interactions, and alters growth factor signaling pathways. Additionally, ethanol can alter epigenetic mechanisms controlling gene expression. Normally, lineage-specific regulatory factors (i.e., transcription factors) establish the transcriptional networks of each new cell type; the cell’s identity then is maintained through epigenetic alterations in the way in which the DNA encoding each gene becomes packaged within the chromatin. Ethanol exposure can induce epigenetic changes that do not induce genetic mutations but nonetheless alter the course of fetal development and result in a large array of patterning defects. Two crucial enzyme complexes—the Polycomb and Trithorax proteins—are central to the epigenetic programs controlling the intricate balance between self-renewal and the execution of cellular differentiation, with diametrically opposed functions. Prenatal ethanol exposure may disrupt the functions of these two enzyme complexes, altering a crucial aspect of mammalian differentiation. Characterizing the involvement of Polycomb and Trithorax group complexes in the etiology of fetal alcohol spectrum disorders will undoubtedly enhance understanding of the role that epigenetic programming plays in this complex disorder. PMID:24313167
Computational complexity of Boolean functions
NASA Astrophysics Data System (ADS)
Korshunov, Aleksei D.
2012-02-01
Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.
A Case Study of the De Novo Evolution of a Complex Odometric Behavior in Digital Organisms
Grabowski, Laura M.; Bryson, David M.; Dyer, Fred C.; Pennock, Robert T.; Ofria, Charles
2013-01-01
Investigating the evolution of animal behavior is difficult. The fossil record leaves few clues that would allow us to recapitulate the path that evolution took to build a complex behavior, and the large population sizes and long time scales required prevent us from re-evolving such behaviors in a laboratory setting. We present results of a study in which digital organisms–self-replicating computer programs that are subject to mutations and selection–evolved in different environments that required information about past experience for fitness-enhancing behavioral decisions. One population evolved a mechanism for step-counting, a surprisingly complex odometric behavior that was only indirectly related to enhancing fitness. We examine in detail the operation of the evolved mechanism and the evolutionary transitions that produced this striking example of a complex behavior. PMID:23577113
What is the strength of evidence for heart failure disease-management programs?
Clark, Alexander M; Savard, Lori A; Thompson, David R
2009-07-28
Heart failure (HF) disease-management programs are increasingly common. However, some large and recent trials of programs have not reported positive findings. There have also been parallel recent advances in reporting standards and theory around complex nonpharmacological interventions. These developments compel reconsideration in this Viewpoint of how research into HF-management programs should be evaluated, the quality, specificity, and usefulness of this evidence, and the recommendations for future research. Addressing the main determinants of intervention effectiveness by using the PICO (Patient, Intervention, Comparison, and Outcome) approach and the recent CONSORT (Consolidated Standards of Reporting Trials) statement on nonpharmacological trials, we will argue that in both current trials and meta-analyses, interventions and comparisons are not sufficiently well described; that complex programs have been excessively oversimplified; and that potentially salient differences in programs, populations, and settings are not incorporated into analyses. In preference to more general meta-analyses of programs, adequate descriptions are first needed of populations, interventions, comparisons, and outcomes in past and future trials. This could be achieved via a systematic survey of study authors based on the CONSORT statement. These more detailed data on studies should be incorporated into future meta-analyses of comparable trials and used with other techniques such as patient-based outcomes data and meta-regression. Although trials and meta-analyses continue to have potential to generate useful evidence, a more specific evidence base is needed to support the development of effective programs for different populations and settings.
Parallel solution of sparse one-dimensional dynamic programming problems
NASA Technical Reports Server (NTRS)
Nicol, David M.
1989-01-01
Parallel computation offers the potential for quickly solving large computational problems. However, it is often a non-trivial task to effectively use parallel computers. Solution methods must sometimes be reformulated to exploit parallelism; the reformulations are often more complex than their slower serial counterparts. We illustrate these points by studying the parallelization of sparse one-dimensional dynamic programming problems, those which do not obviously admit substantial parallelization. We propose a new method for parallelizing such problems, develop analytic models which help us to identify problems which parallelize well, and compare the performance of our algorithm with existing algorithms on a multiprocessor.
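To make the difficulty concrete, the sketch below (a hedged illustration, not the paper's reformulation) shows the obvious way to add parallelism to a one-dimensional recurrence f[j] = min over i < j of f[i] + c(i, j): the outer loop over j remains strictly serial, and only the inner minimisation over predecessors is split across threads. The cost function and problem size are stand-ins.

    // Illustrative sketch (not the paper's algorithm): a 1-D dynamic program whose
    // outer recurrence stays serial; only the inner minimisation over predecessors
    // of each entry is parallelized across threads.
    #include <algorithm>
    #include <cstdio>
    #include <limits>
    #include <thread>
    #include <vector>

    static double cost(int i, int j) { double d = j - i; return d * d + 1.0; }  // stand-in

    int main() {
        const int n = 5000;
        const unsigned nthreads = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> f(n, 0.0);                 // f[0] = 0 is the base case

        for (int j = 1; j < n; ++j) {                  // the recurrence itself stays serial
            std::vector<double> partial(nthreads, std::numeric_limits<double>::infinity());
            std::vector<std::thread> pool;
            for (unsigned t = 0; t < nthreads; ++t) {
                pool.emplace_back([&, t] {
                    for (int i = t; i < j; i += nthreads)   // strided share of predecessors
                        partial[t] = std::min(partial[t], f[i] + cost(i, j));
                });
            }
            for (std::thread& th : pool) th.join();
            f[j] = *std::min_element(partial.begin(), partial.end());
        }
        std::printf("f[n-1] = %.1f\n", f[n - 1]);
        return 0;
    }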
NASA Astrophysics Data System (ADS)
Konya, Andrew; Santangelo, Christian; Selinger, Robin
2014-03-01
When the underlying microstructure of an actuatable material varies in space, simple sheets can transform into complex shapes. Using nonlinear finite element elastodynamic simulations, we explore the design space of two such materials: liquid crystal elastomers and swelling polymer gels. Liquid crystal elastomers (LCE) undergo shape transformations induced by stimuli such as heating/cooling or illumination; complex deformations may be programmed by ``blueprinting'' a non-uniform director field in the sample when the polymer is cross-linked. Similarly, swellable gels can undergo shape change when they are swollen anisotropically as programmed by recently developed halftone gel lithography techniques. For each of these materials we design and test programmable motifs which give rise to complex deformation trajectories including folded structures, soft swimmers, apertures that open and close, bas relief patterns, and other shape transformations inspired by art and nature. In order to accommodate the large computational needs required to model these materials, our 3-d nonlinear finite element elastodynamics simulation algorithm is implemented in CUDA, running on a single GPU-enabled workstation.
Big Science and the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Giudice, Gian Francesco
2012-03-01
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.
Rapid spread of complex change: a case study in inpatient palliative care.
Della Penna, Richard; Martel, Helene; Neuwirth, Esther B; Rice, Jennifer; Filipski, Marta I; Green, Jennifer; Bellows, Jim
2009-12-29
Based on positive findings from a randomized controlled trial, Kaiser Permanente's national executive leadership group set an expectation that all Kaiser Permanente and partner hospitals would implement a consultative model of interdisciplinary, inpatient-based palliative care (IPC). Within one year, the number of IPC consultations program-wide increased almost tenfold from baseline, and the number of teams nearly doubled. We report here results from a qualitative evaluation of the IPC initiative after a year of implementation; our purpose was to understand factors supporting or impeding the rapid and consistent spread of a complex program. Quality improvement study using a case study design and qualitative analysis of in-depth semi-structured interviews with 36 national, regional, and local leaders. Compelling evidence of impacts on patient satisfaction and quality of care generated 'pull' among adopters, expressed as a remarkably high degree of conviction about the value of the model. Broad leadership agreement gave rise to sponsorship and support that permeated the organization. A robust social network promoted knowledge exchange and built on an existing network with a strong interest in palliative care. Resource constraints, pre-existing programs of a different model, and ambiguous accountability for implementation impeded spread. A complex, hospital-based, interdisciplinary intervention in a large health care organization spread rapidly due to a synergy between organizational 'push' strategies and grassroots-level pull. The combination of push and pull may be especially important when the organizational context or the practice to be spread is complex.
NASA Technical Reports Server (NTRS)
Cheng, L. Y.; Larsen, B.
2004-01-01
Launched in 1997, the Cassini-Huygens Mission sent the largest interplanetary spacecraft ever built in the service of science. Carrying a suite of 12 scientific instruments and an atmospheric entry probe, this complex spacecraft, built to explore the Saturn system, might never have gotten off the ground without undergoing significant design changes and cost reductions.
Interoperable Acquisition for Systems of Systems: The Challenges
2006-09-01
James D. Smith II; D. Mike Phillips. September 2006, Technical Note CMU/SEI-2006-TN-034, Software Engineering Institute. Section headings include Failure of Program-Centric Risk Management, Absence of System-of-Systems Engineering, and Disconnect Between System-of-Systems... Abstract: Large, complex systems development has always been challenging, even when the...
Combining high performance simulation, data acquisition, and graphics display computers
NASA Technical Reports Server (NTRS)
Hickman, Robert J.
1989-01-01
Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.
Applying Principles from Complex Systems to Studying the Efficacy of CAM Therapies
Nahin, Richard L.; Calabrese, Carlo; Folkman, Susan; Kimbrough, Elizabeth; Shoham, Jacob; Haramati, Aviad
2010-01-01
In October 2007, a National Center for Complementary and Alternative Medicine (NCCAM)–sponsored workshop, entitled “Applying Principles from Complex Systems to Studying the Efficacy of CAM Therapies,” was held at Georgetown University in Washington, DC. Over a 2-day period, the workshop engaged a small group of experts from the fields of complementary and alternative medicine (CAM) research and complexity science to discuss and examine ways in which complexity science can be applied to CAM research. After didactic presentations and small-group discussions, a number of salient themes and ideas emerged. This article describes the workshop program and summarizes these emergent ideas, which are divided into five broad categories: (1) introduction to complexity; (2) challenges to CAM research; (3) applications of complexity science to CAM; (4) CAM as a model of complexity applied to medicine; and (5) future directions. The article also discusses possible benefits and challenges associated with applying complexity science to CAM research. By providing an introductory framework for this collaboration and exchange, it is hoped that this article may stimulate further inquiry into this largely unexplored area of research. PMID:20715978
Kalfa, David; Chai, Paul; Bacha, Emile
2014-08-01
A significant inverse relationship of surgical institutional and surgeon volumes to outcome has been demonstrated in many high-stakes surgical specialties. By and large, the same results were found in pediatric cardiac surgery, for which a more thorough analysis has shown that this relationship depends on case complexity and type of surgical procedures. Lower-volume programs tend to underperform larger-volume programs as case complexity increases. High-volume pediatric cardiac surgeons also tend to have better results than low-volume surgeons, especially at the more complex end of the surgery spectrum (e.g., the Norwood procedure). Nevertheless, this trend for lower mortality rates at larger centers is not universal. Not all larger programs perform better than all smaller programs. Moreover, surgical volume seems to account for only a small proportion of the overall between-center variation in outcome. Intraoperative technical performance is one of the most important parts, if not the most important part, of the therapeutic process and a critical component of postoperative outcome. Thus, the use of center-specific, risk-adjusted outcome as a tool for quality assessment together with monitoring of technical performance using a specific score may be more reliable than relying on volume alone. However, the relationship between surgical volume and outcome in pediatric cardiac surgery is strong enough that it ought to support adapted and well-balanced health care strategies that take advantage of the positive influence that higher center and surgeon volumes have on outcome.
Self-assembly kinetics of DNA functionalised liposomes
NASA Astrophysics Data System (ADS)
Mognetti, B. M.; Bachmann, S. J.; Kotar, J.; Parolini, L.; Petitzon, M.; Cicuta, P.; di Michele, L.
DNA has been widely used to program state-dependent interactions between functionalised Brownian units, resulting in responsive systems featuring complex phase behaviours. In this talk I will show how DNA can also be used to control aggregation kinetics in systems of liposomes functionalised by three types of linkers that can simultaneously bind. In doing so, I will present a general coarse-graining strategy that allows calculating the adhesion free energy between pairs of compliant units functionalised by mobile binders. I will highlight the important role played by bilayer deformability and will calculate the free energy contribution due to the presence of complexes made by more than two binders. Finally, we will demonstrate the importance of explicitly accounting for the kinetics underlying ligand-receptor reactions when studying large-scale self-assembly. We acknowledge support from ULB, the Oppenheimer Fund, and the EPSRC Programme Grant CAPITALS No. EP/J017566/1.
Industrial metrology as applied to large physics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veal, D.
1993-05-01
A physics experiment is a large complex 3-D object (typ. 1200 m³, 35000 tonnes), with sub-millimetric alignment requirements. Two generic survey alignment tasks can be identified: first, an iterative positioning of the apparatus subsystems in space and, second, a quantification of as-built parameters. The most convenient measurement technique is industrial triangulation, but the complexity of the measured object and measurement environment constraints frequently requires a more sophisticated approach. To enlarge the "survey alignment toolbox," measurement techniques commonly associated with other disciplines such as geodesy, applied geodesy for accelerator alignment, and mechanical engineering are also used. Disparate observables require a heavy reliance on least squares programs for campaign pre-analysis and calculation. This paper will offer an introduction to the alignment of physics experiments and will identify trends for the next generation of SSC experiments.
MonALISA, an agent-based monitoring and control system for the LHC experiments
NASA Astrophysics Data System (ADS)
Balcas, J.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on the Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs or services, system control and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.
Space and energy. [space systems for energy generation, distribution and control
NASA Technical Reports Server (NTRS)
Bekey, I.
1976-01-01
Potential contributions of space to energy-related activities are discussed. Advanced concepts presented include worldwide energy distribution to substation-sized users using low-altitude space reflectors; powering large numbers of large aircraft worldwide using laser beams reflected from space mirror complexes; providing night illumination via sunlight-reflecting space mirrors; fine-scale power programming and monitoring in transmission networks by observing millions of network points from space; prevention of undetected hijacking of nuclear reactor fuels by space tracking of signals from tagging transmitters on all such materials; and disposal of nuclear power plant radioactive wastes in space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poliakov, Alexander; Couronne, Olivier
2002-11-04
Aligning large vertebrate genomes that are structurally complex poses a variety of problems not encountered on smaller scales. Such genomes are rich in repetitive elements and contain multiple segmental duplications, which increases the difficulty of identifying true orthologous DNA segments in alignments. The sizes of the sequences make many alignment algorithms designed for comparing single proteins extremely inefficient when processing large genomic intervals. We integrated both local and global alignment tools and developed a suite of programs for automatically aligning large vertebrate genomes and identifying conserved non-coding regions in the alignments. Our method uses the BLAT local alignment program to find anchors on the base genome to identify regions of possible homology for a query sequence. These regions are postprocessed to find the best candidates, which are then globally aligned using the AVID global alignment program. In the last step, conserved non-coding segments are identified using VISTA. Our methods are fast and the resulting alignments exhibit a high degree of sensitivity, covering more than 90% of known coding exons in the human genome. The GenomeVISTA software is a suite of Perl programs that is built on a MySQL database platform. The scheduler gets control data from the database, builds a queue of jobs, and dispatches them to a PC cluster for execution. The main program, running on each node of the cluster, processes individual sequences. A Perl library acts as an interface between the database and the above programs. The use of a separate library allows the programs to function independently of the database schema. The library also improves on the standard Perl MySQL database interface package by providing auto-reconnect functionality and improved error handling.
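As a rough illustration of the anchor-then-refine strategy described above (a sketch, not the GenomeVISTA code itself), a Python driver could chain the three stages as follows; the run_blat, run_avid and run_vista helpers are hypothetical wrappers around the actual command-line tools.

```python
# Minimal sketch of an anchor-then-refine alignment pipeline in the spirit of
# GenomeVISTA: local hits locate candidate regions on the base genome, the best
# candidates are globally aligned, and conserved segments are then extracted.
# The three run_* helpers are hypothetical wrappers around the real tools.
from dataclasses import dataclass

@dataclass
class Hit:
    chrom: str
    start: int
    end: int
    score: float

def run_blat(base_genome: str, query: str) -> list[Hit]:
    """Hypothetical wrapper: return local-alignment anchors on the base genome."""
    raise NotImplementedError

def run_avid(base_region: tuple, query: str) -> str:
    """Hypothetical wrapper: return a global alignment of one candidate region."""
    raise NotImplementedError

def run_vista(alignment: str) -> list[tuple]:
    """Hypothetical wrapper: return conserved non-coding segments."""
    raise NotImplementedError

def align_query(base_genome: str, query: str, top_n: int = 3) -> list[tuple]:
    anchors = run_blat(base_genome, query)                  # step 1: local anchors
    best = sorted(anchors, key=lambda h: h.score, reverse=True)[:top_n]
    conserved = []
    for hit in best:                                        # step 2: global alignment
        aln = run_avid((hit.chrom, hit.start, hit.end), query)
        conserved.extend(run_vista(aln))                    # step 3: conservation
    return conserved
```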
Folta, Sara C; Koomas, Alyssa; Metayer, Nesly; Fullerton, Karen J; Hubbard, Kristie L; Anzman-Frasca, Stephanie; Hofer, Teresa; Nelson, Miriam; Newman, Molly; Sacheck, Jennifer; Economos, Christina
2015-12-24
Little effort has focused on the role of volunteer-led out-of-school time (OST) programs (ie, enrichment and sports programs) as key environments for the promotion of healthy eating and physical activity habits among school-aged children. The Healthy Kids Out of School (HKOS) initiative developed evidence-based, practical guiding principles for healthy snacks, beverages, and physical activity. The goal of this case study was to describe the methods used to engage regional partners to understand how successful implementation and dissemination of these principles could be accomplished. HKOS partnered with volunteer-led programs from 5 OST organizations in Maine, Massachusetts, and New Hampshire to create a regional "learning laboratory." We engaged partners in phases. In the first phase, we conducted focus groups with local volunteer program leaders; during the second phase, we held roundtable meetings with regional and state program administrators; and in the final phase, we conducted additional outreach to refine and finalize implementation strategies. Implementation strategies were developed based on themes and information that emerged. For enrichment programs, strategies included new patch and pin programs that were consistent with the organizations' infrastructure and usual practices. For sports programs, the main strategy was integration with online trainings for coaches. Through the engagement process, we learned that dissemination of the guiding principles in these large and complex OST organizations was best accomplished by using implementation strategies that were customized, integrated, and aligned with goals and usual practices. The lessons learned can benefit future efforts to prevent obesity in complex environments.
Compositional stratigraphy of crustal material from near-infrared spectra
NASA Technical Reports Server (NTRS)
Pieters, Carle M.
1987-01-01
An Earth-based telescopic program to acquire near-infrared spectra of freshly exposed lunar material now contains data for 17 large impact craters with central peaks. Noritic, gabbroic, anorthositic and troctolitic rock types can be distinguished for areas within these large craters from characteristic absorptions in individual spectra of their walls and central peaks. Norites dominate the upper lunar crust while the deeper crustal zones also contain significant amounts of gabbros and anorthosites. Data for material associated with large craters indicate that not only is the lunar crust highly heterogeneous across the nearside, but that the compositional stratigraphy of the lunar crust is nonuniform. Crustal complexity should be expected for other planetary bodies, which should be studied using high spatial and spectral resolution data in and around large impact craters.
Java Performance for Scientific Applications on LLNL Computer Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapfer, C; Wissink, A
2002-05-10
Languages in use for high performance computing at the laboratory--Fortran (f77 and f90), C, and C++--have many years of development behind them and are generally considered the fastest available. However, Fortran and C do not readily extend to object-oriented programming models, limiting their capability for very complex simulation software. C++ facilitates object-oriented programming but is a very complex and error-prone language. Java offers a number of capabilities that these other languages do not. For instance it implements cleaner (i.e., easier to use and less prone to errors) object-oriented models than C++. It also offers networking and security as part of the language standard, and cross-platform executables that make it architecture neutral, to name a few. These features have made Java very popular for industrial computing applications. The aim of this paper is to explain the trade-offs in using Java for large-scale scientific applications at LLNL. Despite its advantages, the computational science community has been reluctant to write large-scale computationally intensive applications in Java due to concerns over its poor performance. However, considerable progress has been made over the last several years. The Java Grande Forum [1] has been promoting the use of Java for large-scale computing. Members have introduced efficient array libraries, developed fast just-in-time (JIT) compilers, and built links to existing packages used in high performance parallel computing.
NASA Technical Reports Server (NTRS)
Benavente, Javier E.; Luce, Norris R.
1989-01-01
Demands for nonlinear time history simulations of large, flexible multibody dynamic systems have created a need for efficient interfaces between finite-element modeling programs and time-history simulations. One such interface, TREEFLX, an interface between NASTRAN and TREETOPS, a nonlinear dynamics and controls time history simulation for multibody structures, is presented and demonstrated via example using the proposed Space Station Mobile Remote Manipulator System (MRMS). The ability to run all three programs (NASTRAN, TREEFLX and TREETOPS), in addition to other programs used for controller design and model reduction (such as DMATLAB and TREESEL, both described), under a UNIX Workstation environment demonstrates the flexibility engineers now have in designing, developing and testing control systems for dynamically complex systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryu, Jun-hyung
University education aims to supply qualified human resources for industries. In complex large-scale engineering systems such as nuclear power plants, the importance of qualified human resources cannot be overstated. The corresponding education program should cover many topics systematically. Recently, a nuclear engineering program has been initiated at Dongguk University, South Korea. The current education program focuses on undergraduate-level nuclear engineering students. Our main objective is to provide industry with fresh engineers who understand the interconnection of local parts and the entire systems of nuclear power plants and the associated systems. From this experience, there is a huge opportunity for the chemical engineering discipline in the context of giving a macroscopic overview of nuclear power plants and waste treatment management by strengthening the ability to analyze fundamental situations. (authors)
NASA Technical Reports Server (NTRS)
Mckay, Charles; Auty, David; Rogers, Kathy
1987-01-01
System interface sets (SIS) for large, complex, non-stop, distributed systems are examined. The SIS of the Space Station Program (SSP) was selected as the focus of this study because an appropriate virtual interface specification of the SIS is believed to have the most potential to free the project from four life cycle tyrannies which are rooted in a dependence on either a proprietary or particular instance of: operating systems, data management systems, communications systems, and instruction set architectures. The static perspective of the common Ada programming support environment interface set (CAIS) and the portable common execution environment (PCEE) activities are discussed. Also, the dynamic perspective of the PCEE is addressed.
Dependency visualization for complex system understanding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smart, J. Allison Cory
1994-09-01
With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed graph based concepts in software tool design has been demonstrated in literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
Meteorological Support at the Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Addis, Robert P.
2005-10-14
The Department of Energy (DOE) operates many nuclear facilities on large complexes across the United States in support of national defense. The operation of these many and varied facilities and processes requires meteorological support for many purposes, including: for routine operations, to respond to severe weather events, such as lightning, tornadoes and hurricanes, to support the emergency response functions in the event of a release of materials to the environment, and for engineering baseline and safety documentation, as well as hazards assessments, etc. This paper describes a program of meteorological support to the Savannah River Site, a DOE complex located in South Carolina.
Establishing and Maintaining an Extensive Library of Patient-Derived Xenograft Models.
Mattar, Marissa; McCarthy, Craig R; Kulick, Amanda R; Qeriqi, Besnik; Guzman, Sean; de Stanchina, Elisa
2018-01-01
Patient-derived xenograft (PDX) models have recently emerged as a highly desirable platform in oncology and are expected to substantially broaden the way in vivo studies are designed and executed and to reshape drug discovery programs. However, acquisition of patient-derived samples, and propagation, annotation and distribution of PDXs are complex processes that require a high degree of coordination among clinic, surgery and laboratory personnel, and are fraught with challenges that are administrative, procedural and technical. Here, we examine in detail the major aspects of this complex process and relate our experience in establishing a PDX Core Laboratory within a large academic institution.
A global view of atmospheric ice particle complexity
NASA Astrophysics Data System (ADS)
Schmitt, Carl G.; Heymsfield, Andrew J.; Connolly, Paul; Järvinen, Emma; Schnaiter, Martin
2016-11-01
Atmospheric ice particles exist in a variety of shapes and sizes. Single hexagonal crystals like common hexagonal plates and columns are possible, but more frequently, atmospheric ice particles are much more complex. Ice particle shapes have a substantial impact on many atmospheric processes, from fall speed, which affects cloud lifetime, to radiative properties, which affect the energy balance, to name a few. This publication builds on earlier work where a technique was demonstrated to separate single crystals and aggregates of crystals using particle imagery data from aircraft field campaigns. Here, data from 10 field programs have been analyzed and ice particle complexity has been parameterized by cloud temperature for arctic, midlatitude (summer and frontal), and tropical cloud systems. Results show that the transition from simple to complex particles can be as small as 80 µm or as large as 400 µm depending on conditions. All regimes show trends of decreasing transition size with decreasing temperature.
Mehdipanah, Roshanak; Malmusi, Davide; Muntaner, Carles; Borrell, Carme
2013-09-01
Urban renewal programs aim to improve the physical and socioeconomic conditions of neighborhoods. However, due to the intervention's complexity, there is often little evidence of their impact on health and health inequalities. This study aimed to identify the perception of a group of neighborhood residents towards a large-scale urban renewal program in Barcelona and to explore its effects and importance on their wellbeing using concept mapping methodology. Our results indicate that the majority of urban renewal projects within the initiative, including improved walkability, construction of new public spaces and more community programs, have positive and important effects on the overall wellbeing of participants. This study presents an innovative method that diverges from the traditional outcome-based evaluation studies often used within this field. Copyright © 2013 Elsevier Ltd. All rights reserved.
Biological Defense Research Program
1989-04-01
[Excerpt, truncated] …difference between life and death. Some recent examples are: BDRP developed the VEE vaccine used in Central America, Mexico, and Texas (1969-1971) and Rift… [The] Complex is an area owned by the Bureau of Land Management, which is available for grazing and, with specific permission, for use by DPG. [The remainder of the excerpt is table residue listing laboratory staffing figures from the 1930s-1950s.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
AvalonBay Communities, which is a large multifamily developer, was developing a three-building complex in Elmsford, New York. The buildings were planned to be certified to the ENERGY STAR® Homes Version 3 program. This plan led to AvalonBay partnering with the Advanced Residential Integrated Solutions (ARIES) collaborative, which is a U.S. Department of Energy Building America team. ARIES worked with AvalonBay to redesign the project to comply with Zero Energy Ready Home (ZERH) criteria.
Parallel transformation of K-SVD solar image denoising algorithm
NASA Astrophysics Data System (ADS)
Liang, Youwen; Tian, Yu; Li, Mei
2017-02-01
The images obtained by observing the sun through a large telescope always suffer from noise due to the low SNR. The K-SVD denoising algorithm can effectively remove Gaussian white noise. Training dictionaries for sparse representations is a time-consuming task, due to the large size of the data involved and to the complexity of the training algorithms. In this paper, OpenMP parallel programming is used to transform the serial algorithm into a parallel version. A data-parallelism model is used to transform the algorithm. The biggest change is that multiple atoms, rather than a single atom, are updated simultaneously. The denoising effect and acceleration performance are tested after completion of the parallel algorithm. The speedup of the program is 13.563 when 16 cores are used. This parallel version can fully utilize multi-core CPU hardware resources, greatly reduces running time, and is easily ported to multi-core platforms.
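A minimal sketch of that data-parallel idea, written here with Python's multiprocessing as an analogue of the paper's OpenMP scheme (not the authors' code); update_atom is a simplified stand-in for the K-SVD rank-one atom update.

```python
# Rough analogue of the data-parallel K-SVD update: instead of refining
# dictionary atoms one after another, all atoms are refined in parallel and the
# dictionary is reassembled afterwards.  update_atom is a simplified stand-in
# for the K-SVD rank-one update step (the sparse codes are not updated here).
import numpy as np
from multiprocessing import Pool

def update_atom(args):
    k, D, X, Y = args                        # atom index, dictionary, codes, data
    users = np.nonzero(X[k, :])[0]           # signals that actually use atom k
    if users.size == 0:
        return k, D[:, k]
    # residual with atom k's contribution removed, restricted to its users
    E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
    u, s, vt = np.linalg.svd(E, full_matrices=False)
    return k, u[:, 0]                        # updated atom: first left singular vector

def parallel_dictionary_update(D, X, Y, workers=16):
    with Pool(workers) as pool:
        results = pool.map(update_atom, [(k, D, X, Y) for k in range(D.shape[1])])
    D_new = D.copy()
    for k, atom in results:
        D_new[:, k] = atom
    return D_new
```

Reassembling the dictionary only after all atoms have been refined is what distinguishes this simultaneous scheme from the strictly sequential K-SVD sweep.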
Shrader, Sarah; Hodgkins, Renee; Laverentz, Delois; Zaudke, Jana; Waxman, Michael; Johnston, Kristy; Jernigan, Stephen
2016-09-01
Health profession educators and administrators are interested in how to develop an effective and sustainable interprofessional education (IPE) programme. We describe the approach used at the University of Kansas Medical Centre, Kansas City, United States. This approach is a foundational programme with multiple large-scale, half-day events each year. The programme is threaded with common curricular components that build in complexity over time and assures that each learner is exposed to IPE. In this guide, lessons learned and general principles related to the development of IPE programming are discussed. Important areas that educators should consider include curriculum development, engaging leadership, overcoming scheduling barriers, providing faculty development, piloting the programming, planning for logistical coordination, intentionally pairing IP facilitators, anticipating IP conflict, setting clear expectations for learners, publicising the programme, debriefing with faculty, planning for programme evaluation, and developing a scholarship and dissemination plan.
An expert system executive for automated assembly of large space truss structures
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1993-01-01
Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.
Large Eddy Simulation of Engineering Flows: A Bill Reynolds Legacy.
NASA Astrophysics Data System (ADS)
Moin, Parviz
2004-11-01
The term "large eddy simulation" (LES) was coined by Bill Reynolds thirty years ago, when he and his colleagues pioneered the introduction of LES in the engineering community. Bill's legacy in LES features his insistence on having a proper mathematical definition of the large scale field independent of the numerical method used, and his vision for using numerical simulation output as data for research in turbulence physics and modeling, just as one would think of using experimental data. However, as an engineer, Bill was predominantly interested in the predictive capability of computational fluid dynamics and in particular LES. In this talk I will present the state of the art in large eddy simulation of complex engineering flows. Most of this technology has been developed in the Department of Energy's ASCI Program at Stanford, which was led by Bill in the last years of his distinguished career. At the core of this technology is a fully implicit non-dissipative LES code which uses unstructured grids with arbitrary elements. A hybrid Eulerian/Lagrangian approach is used for multi-phase flows, and chemical reactions are introduced through dynamic equations for mixture fraction and reaction progress variable in conjunction with flamelet tables. The predictive capability of LES is demonstrated in several validation studies in flows with complex physics and complex geometry, including flow in the combustor of a modern aircraft engine. LES in such a complex application is only possible through efficient utilization of modern parallel supercomputers, which was recognized and emphasized by Bill from the beginning. The presentation will include a brief mention of computer science efforts for efficient implementation of LES.
Large Instrument Development for Radio Astronomy
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
High performance computing in biology: multimillion atom simulations of nanoscale systems
Sanbonmatsu, K. Y.; Tung, C.-S.
2007-01-01
Computational methods have been used in biology for sequence analysis (bioinformatics), all-atom simulation (molecular dynamics and quantum calculations), and more recently for modeling biological networks (systems biology). Of these three techniques, all-atom simulation is currently the most computationally demanding, in terms of compute load, communication speed, and memory load. Breakthroughs in electrostatic force calculation and dynamic load balancing have enabled molecular dynamics simulations of large biomolecular complexes. Here, we report simulation results for the ribosome, using approximately 2.64 million atoms, the largest all-atom biomolecular simulation published to date. Several other nanoscale systems with different numbers of atoms were studied to measure the performance of the NAMD molecular dynamics simulation program on the Los Alamos National Laboratory Q Machine. We demonstrate that multimillion atom systems represent a 'sweet spot' for the NAMD code on large supercomputers. NAMD displays an unprecedented 85% parallel scaling efficiency for the ribosome system on 1024 CPUs. We also review recent targeted molecular dynamics simulations of the ribosome that prove useful for studying conformational changes of this large biomolecular complex in atomic detail. PMID:17187988
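For readers unfamiliar with the metric quoted above, parallel scaling efficiency is simply the measured speedup divided by the processor count; a minimal helper (not part of NAMD) makes the 85%-on-1024-CPUs figure concrete.

```python
# Minimal helper illustrating the scaling metric quoted in the abstract:
# speedup S(N) = T(1) / T(N) and parallel efficiency E(N) = S(N) / N.
def speedup(t_serial: float, t_parallel: float) -> float:
    return t_serial / t_parallel

def parallel_efficiency(t_serial: float, t_parallel: float, n_cpus: int) -> float:
    return speedup(t_serial, t_parallel) / n_cpus

# An efficiency of 0.85 on 1024 CPUs, as reported for the ribosome run,
# corresponds to an effective speedup of about 0.85 * 1024, i.e. roughly 870.
```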
NASA Technical Reports Server (NTRS)
Hudiburg, John J.; Chinworth, Michael W.
2005-01-01
The President's Commission on Implementation of United States Space Exploration Policy suggests that after NASA establishes the Space Exploration vision architecture, it should pursue international partnerships. Two possible approaches were suggested: multiple independently operated missions and an integrated mission with carefully selected international components. The U.S.-Japan defense sectors have learned key lessons from experience with both of these approaches. U.S.-Japan defense cooperation has evolved over forty years from simple military assistance programs to more complex joint development efforts. With the evolution of the political-military alliance and the complexity of defense programs, these cooperative efforts have engaged increasingly industrial resources and capabilities as well as more sophisticated forms of planning, technology transfers and program management. Some periods of this evolution have been marked by significant frictions. The U.S.Japan FS-X program, for example, provides a poor example for management of international cooperation. In November 1988, the United States and Japan signed a Memorandum of Understanding (MOU) to co-develop an aircraft, named FS-X and later renamed F -2, as a replacement to the aging Japan support fighter F-l. The program was marked by numerous political disputes. After over a decade of joint development and testing, F -2 production deliveries finally began in 1999. The production run was curtailed due to much higher than anticipated costs and less than desired aircraft performance. One universally agreed "lesson" from the FSX/F-2 case was that it did not represent the ideal approach to bilateral cooperation. More recent cooperative programs have involved targeted joint research and development, including component development for ballistic missile defense systems. These programs could lay the basis for more ambitious cooperative efforts. This study examines both less-than-stellar international cooperation efforts as well as more successful initiatives to identify lessons from military programs that can help NASA encourage global investment in its Space Exploration Vision. The paper establishes a basis for examining related policy and industrial concerns such as effective utilization of dual-use technologies and trans-Pacific program management of large, complex cooperative programs.
An automated method for finding molecular complexes in large protein interaction networks
Bader, Gary D; Hogue, Christopher WV
2003-01-01
Background: Recent advances in proteomics technologies such as two-hybrid, phage display and mass spectrometry have enabled us to create a detailed map of biomolecular interaction networks. Initial mapping efforts have already produced a wealth of data. As the size of the interaction set increases, databases and computational methods will be required to store, visualize and analyze the information in order to effectively aid in knowledge discovery. Results: This paper describes a novel graph theoretic clustering algorithm, "Molecular Complex Detection" (MCODE), that detects densely connected regions in large protein-protein interaction networks that may represent molecular complexes. The method is based on vertex weighting by local neighborhood density and outward traversal from a locally dense seed protein to isolate the dense regions according to given parameters. The algorithm has the advantage over other graph clustering methods of having a directed mode that allows fine-tuning of clusters of interest without considering the rest of the network and allows examination of cluster interconnectivity, which is relevant for protein networks. Protein interaction and complex information from the yeast Saccharomyces cerevisiae was used for evaluation. Conclusion: Dense regions of protein interaction networks can be found, based solely on connectivity data, many of which correspond to known protein complexes. The algorithm is not affected by a known high rate of false positives in data from high-throughput interaction techniques. The program is available from . PMID:12525261
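A drastically simplified Python sketch of the two stages described above (vertex weighting by closed-neighbourhood density, then greedy outward traversal from high-weight seeds) is given below; it omits MCODE's k-core weighting and the "fluff"/"haircut" post-processing, so it illustrates the idea rather than reproducing the published algorithm.

```python
# Simplified sketch of MCODE-style clustering: (1) weight each vertex by the
# density of its closed neighbourhood, (2) grow a candidate complex outward
# from the highest-weight unvisited seed, admitting neighbours whose weight is
# within a cutoff of the seed's weight.  The real algorithm also uses k-cores
# and post-processing ("fluff"/"haircut"), omitted here.
import networkx as nx

def vertex_weights(G):
    w = {}
    for v in G:
        nbhd = G.subgraph(list(G.neighbors(v)) + [v])
        w[v] = nx.density(nbhd)
    return w

def grow_complex(G, w, seed, cutoff, visited):
    cluster, frontier = {seed}, [seed]
    while frontier:
        v = frontier.pop()
        for u in G.neighbors(v):
            if u not in cluster and u not in visited and w[u] >= (1 - cutoff) * w[seed]:
                cluster.add(u)
                frontier.append(u)
    return cluster

def find_complexes(G, cutoff=0.2, min_size=3):
    w = vertex_weights(G)
    visited, complexes = set(), []
    for seed in sorted(G, key=w.get, reverse=True):
        if seed in visited:
            continue
        c = grow_complex(G, w, seed, cutoff, visited)
        visited |= c
        if len(c) >= min_size:
            complexes.append(c)
    return complexes
```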
Beissner, Katherine L.; Bach, Eileen; Murtaugh, Christopher M.; Trifilio, MaryGrace; Henderson, Charles R.; Barrón, Yolanda; Trachtenberg, Melissa A.; Reid, M. Carrington
2017-01-01
Activity-limiting pain is common among older home care patients and pain management is complicated by the high prevalence of physical frailty and multimorbidity in the home care population. A comparative effectiveness study was undertaken at a large urban home care agency to examine an evidence-based pain self-management program delivered by physical therapists (PTs). This article focuses on PT training, methods implemented to reinforce content after training and to encourage uptake of the program with appropriate patients, and therapists’ fidelity to the program. Seventeen physical therapy teams were included in the cluster randomized controlled trial, with 8 teams (155 PTs) assigned to a control and 9 teams (165 PTs) assigned to a treatment arm. Treatment therapists received interactive training over two sessions, with a follow-up session 6 months later. Additional support was provided via emails, e-learning materials including videos, and a therapist manual. Program fidelity was assessed by examining PT pain documentation in the agency’s electronic health record. PT feedback on the program was obtained via semistructured surveys. There were no between-group differences in the number of PTs documenting program elements with the exception of instruction in the use of imagery, which was documented by a higher percentage of intervention therapists (p = 0.002). PTs felt comfortable teaching the program elements, but cited time as the biggest barrier to implementing the protocol. Possible explanations for study results suggesting limited adherence to the program protocol by intervention-group PTs include the top-down implementation strategy, competing organizational priorities, program complexity, competing patient priorities, and inadequate patient buy-in. Implications for the implementation of complex new programs in the home healthcare setting are discussed. PMID:28157776
Petasis, Doros T; Hendrich, Michael P
2015-01-01
Electron paramagnetic resonance (EPR) spectroscopy has long been a primary method for characterization of paramagnetic centers in materials and biological complexes. Transition metals in biological complexes have valence d-orbitals that largely define the chemistry of the metal centers. EPR spectra are distinctive for metal type, oxidation state, protein environment, substrates, and inhibitors. The study of many metal centers in proteins, enzymes, and biomimetic complexes has led to the development of a systematic methodology for quantitative interpretation of EPR spectra from a wide array of metal containing complexes. The methodology is now contained in the computer program SpinCount. SpinCount allows simulation of EPR spectra from any sample containing multiple species composed of one or two metals in any spin state. The simulations are quantitative, thus allowing determination of all species concentrations in a sample directly from spectra. This chapter will focus on applications to transition metals in biological systems using EPR spectra from multiple microwave frequencies and modes. © 2015 Elsevier Inc. All rights reserved.
A brief introduction to PYTHIA 8.1
NASA Astrophysics Data System (ADS)
Sjöstrand, Torbjörn; Mrenna, Stephen; Skands, Peter
2008-06-01
The PYTHIA program is a standard tool for the generation of high-energy collisions, comprising a coherent set of physics models for the evolution from a few-body hard process to a complex multihadronic final state. It contains a library of hard processes and models for initial- and final-state parton showers, multiple parton-parton interactions, beam remnants, string fragmentation and particle decays. It also has a set of utilities and interfaces to external programs. While previous versions were written in Fortran, PYTHIA 8 represents a complete rewrite in C++. The current release is the first main one after this transition, and does not yet in every respect replace the old code. It does contain some new physics aspects, on the other hand, that should make it an attractive option especially for LHC physics studies.
Program summary:
Program title: PYTHIA 8.1
Catalogue identifier: ACTU_v3_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ACTU_v3_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPL version 2
No. of lines in distributed program, including test data, etc.: 176 981
No. of bytes in distributed program, including test data, etc.: 2 411 876
Distribution format: tar.gz
Programming language: C++
Computer: Commodity PCs
Operating system: Linux; should also work on other systems
RAM: 8 megabytes
Classification: 11.2
Does the new version supersede the previous version?: Yes, partly
Nature of problem: High-energy collisions between elementary particles normally give rise to complex final states, with large multiplicities of hadrons, leptons, photons and neutrinos. The relation between these final states and the underlying physics description is not a simple one, for two main reasons. Firstly, we do not even in principle have a complete understanding of the physics. Secondly, any analytical approach is made intractable by the large multiplicities.
Solution method: Complete events are generated by Monte Carlo methods. The complexity is mastered by a subdivision of the full problem into a set of simpler separate tasks. All main aspects of the events are simulated, such as hard-process selection, initial- and final-state radiation, beam remnants, fragmentation, decays, and so on. Therefore events should be directly comparable with experimentally observable ones. The programs can be used to extract physics from comparisons with existing data, or to study physics at future experiments.
Reasons for new version: Improved and expanded physics models, transition from Fortran to C++.
Summary of revisions: New user interface, transverse-momentum-ordered showers, interleaving with multiple interactions, and much more.
Restrictions: Depends on the problem studied.
Running time: 10-1000 events per second, depending on process studied.
References: [1] T. Sjöstrand, P. Edén, C. Friberg, L. Lönnblad, G. Miu, S. Mrenna, E. Norrbin, Comput. Phys. Comm. 135 (2001) 238.
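As a usage illustration, and assuming PYTHIA 8 has been built with its optional Python interface (the equivalent readString/init/next calls exist in the C++ API), a minimal event-generation loop looks roughly like this; the settings shown are illustrative choices, not recommendations.

```python
# Minimal event-generation loop, assuming the optional pythia8 Python interface
# is available; the same calls exist in C++ as Pythia::readString/init/next.
import pythia8

pythia = pythia8.Pythia()
pythia.readString("Beams:eCM = 13000.")        # proton-proton collisions at 13 TeV
pythia.readString("HardQCD:all = on")          # switch on hard QCD 2 -> 2 processes
pythia.readString("PhaseSpace:pTHatMin = 20.")
pythia.init()

n_charged = []
for _ in range(100):
    if not pythia.next():                      # generate one complete event
        continue
    count = 0
    for i in range(pythia.event.size()):       # loop over the event record
        p = pythia.event[i]
        if p.isFinal() and p.isCharged():
            count += 1
    n_charged.append(count)

pythia.stat()                                  # print cross-section statistics
print("mean charged multiplicity:", sum(n_charged) / max(len(n_charged), 1))
```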
Friedman, Karen A; Raimo, John; Spielmann, Kelly; Chaudhry, Saima
2016-01-01
Introduction: Under the Next Accreditation System, programs need to find ways to collect and assess meaningful reportable information on their residents to assist the program director regarding resident milestone progression. This paper discusses the process that one large Internal Medicine Residency Program used to provide both quantitative and qualitative data to its clinical competency committee (CCC) through the creation of a resident dashboard. Methods: Program leadership at a large university-based program developed four new end of rotation evaluations based on the American Board of Internal Medicine (ABIM) and Accreditation Council for Graduate Medical Education's (ACGME) 22 reportable milestones. A resident dashboard was then created to pull together both milestone- and non-milestone-based quantitative data and qualitative data compiled from faculty, nurses, peers, staff, and patients. Results: Dashboards were distributed to the members of the CCC in preparation for the semiannual CCC meeting. CCC members adjudicated quantitative and qualitative data to present their cohort of residents at the CCC meeting. Based on the committee's response, evaluation scores remained the same or were adjusted. Final milestone scores were then entered into the accreditation data system (ADS) on the ACGME website. Conclusions: The process of resident assessment is complex and should comprise both quantitative and qualitative data. The dashboard is a valuable tool for program leadership to use both when evaluating house staff on a semiannual basis at the CCC and when reviewing progress with the resident in person.
NASA Astrophysics Data System (ADS)
Knudson, Christa K.; Kemp, Michael C.; Lombardo, Nicholas J.
2009-05-01
The U.S. Department of Homeland Security's Standoff Technology Integration and Demonstration Program is designed to accelerate the development and integration of technologies, concepts of operations, and training to defeat explosives attacks at large public events and mass transit facilities. The program will address threats posed by suicide bombers, vehicle-borne improvised explosive devices, and leave-behind bombs. The program is focused on developing and testing explosives countermeasure architectures using commercial off-the-shelf and near-commercial standoff and remotely operated detection technologies in prototypic operational environments. An important part of the program is the integration of multiple technologies and systems to protect against a wider range of threats, improve countermeasure performance, increase the distance from the venue at which screening is conducted, and reduce staffing requirements. The program will routinely conduct tests in public venues involving successively more advanced technology, higher levels of system integration, and more complex scenarios. This paper describes the initial field test of an integrated countermeasure system that included infrared, millimeter-wave, and video analytics technologies for detecting person-borne improvised explosive devices at a public arena. The test results are being used to develop a concept for the next generation of integrated countermeasures, to refine technical and operational requirements for architectures and technologies, and engage industry and academia in solution development.
Thermal Imaging for Inspection of Large Cryogenic Tanks
NASA Technical Reports Server (NTRS)
Arens, Ellen
2012-01-01
The end of the Shuttle Program provides an opportunity to evaluate and possibly refurbish launch support infrastructure at the Kennedy Space Center in support of future launch vehicles. One major infrastructure element needing attention is the cryogenic fuel and oxidizer system and specifically the cryogenic fuel ground storage tanks located at Launch Complex 39. These tanks were constructed in 1965 and served both the Apollo and Shuttle Programs and will be used to support future launch programs. However, they have received only external inspection and minimal refurbishment over the years as there were no operational issues that warranted the significant time and schedule disruption required to drain and refurbish the tanks while the launch programs were ongoing. Now, during the break between programs, the health of the tanks is being evaluated and refurbishment is being performed as necessary to maintain their fitness for future launch programs. Thermography was used as one part of the inspection and analysis of the tanks. This paper will describe the conclusions derived from the thermal images to evaluate anomalous regions in the tanks, confirm structural integrity of components within the annular region, and evaluate the effectiveness of thermal imaging to detect large insulation voids in tanks prior to filling with cryogenic fluid. The use of thermal imaging as a tool to inspect unfilled tanks will be important if the construction of additional storage tanks is required to fuel new launch vehicles.
Flying Cassini with Virtual Operations Teams
NASA Technical Reports Server (NTRS)
Dodd, Suzanne; Gustavson, Robert
1998-01-01
The Cassini Program's challenge is to fly a large, complex mission with a reduced operations budget. A consequence of the reduced budget is elimination of the large, centrally located group traditionally used for uplink operations. Instead, responsibility for completing parts of the uplink function is distributed throughout the Program. A critical strategy employed to handle this challenge is the use of Virtual Uplink Operations Teams. A Virtual Team is comprised of a group of people with the necessary mix of engineering and science expertise who come together for the purpose of building a specific uplink product. These people are drawn from throughout the Cassini Program and participate across a large geographical area (from Germany to the West coast of the USA), covering ten time zones. The participants will often split their time between participating in the Virtual Team and accomplishing their core responsibilities, requiring significant planning and time management. When the particular uplink product task is complete, the Virtual Team disbands and the members turn back to their home organization element for future work assignments. This time-sharing of employees is used on Cassini to build mission planning products, via the Mission Planning Virtual Team, and sequencing products and monitoring of the sequence execution, via the Sequence Virtual Team. This challenging, multitasking approach allows efficient use of personnel in a resource constrained environment.
Program Helps Decompose Complex Design Systems
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Hall, Laura E.
1995-01-01
The DeMAID (Design Manager's Aid for Intelligent Decomposition) computer program is a knowledge-based software system for ordering the sequence of modules and identifying a possible multilevel structure for design problems such as large platforms in outer space. It groups modular subsystems on the basis of interactions among them. It saves a considerable amount of money and time in the total design process, particularly in a new design problem in which the order of modules has not been defined. Although originally written for design problems, it is also applicable to problems containing modules (processes) that take inputs and generate outputs. It is available in three machine versions: Macintosh (written in Symantec's Think C 3.01), and Sun and SGI IRIS (in the C language).
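A toy sketch of the kind of dependency-based sequencing DeMAID automates (not its knowledge-based algorithm); the module names and dependencies below are hypothetical.

```python
# Toy illustration of dependency-based module sequencing: modules are ordered
# so that, where possible, a module appears after the modules whose outputs it
# consumes; modules caught in a cycle form a coupled block requiring iteration.
from graphlib import TopologicalSorter, CycleError

# hypothetical design modules and the modules whose outputs each one needs
dependencies = {
    "loads":       set(),
    "structure":   {"loads", "controls"},   # feedback coupling with controls
    "controls":    {"structure"},
    "thermal":     {"structure"},
    "performance": {"structure", "thermal"},
}

try:
    order = list(TopologicalSorter(dependencies).static_order())
    print("feed-forward execution order:", order)
except CycleError as err:
    # the nodes reported in the cycle form a coupled block to be iterated
    print("coupled modules requiring iteration:", err.args[1])
```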
Roberts, Daniel P; Lohrke, Scott M
2003-01-01
A number of USDA-ARS programs directed at overcoming impediments to the use of biocontrol agents on a commercial scale are described. These include improvements in screening techniques, taxonomic studies to identify beneficial strains more precisely, and studies on various aspects of the large-scale production of biocontrol agents. Another broad area of studies covers the ecological aspects of biocontrol agents-their interaction with the pathogen, with the plant and with other aspects of the environmental complex. Examples of these studies are given and their relevance to the further development and expansion of biocontrol agents is discussed.
NASA Technical Reports Server (NTRS)
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
The role of small missions in planetary and lunar exploration
NASA Technical Reports Server (NTRS)
1995-01-01
The Space Studies Board of the National Research Council charged its Committee on Planetary and Lunar Exploration (COMPLEX) to (1) examine the degree to which small missions, such as those fitting within the constraints of the Discovery program, can achieve priority objectives in the lunar and planetary sciences; (2) determine those characteristics, such as level of risk, flight rate, target mix, university involvement, technology development, management structure and procedures, and so on, that could allow a successful program; (3) assess issues, such as instrument selection, mission operations, data analysis, and data archiving, to ensure the greatest scientific return from a particular mission, given a rapid deployment schedule and a tightly constrained budget; and (4) review past programmatic attempts to establish small planetary science mission lines, including the Planetary Observers and Planetary Explorers, and consider the impact management practices have had on such programs. A series of small missions presents the planetary science community with the opportunity to expand the scope of its activities and to develop the potential and inventiveness of its members in ways not possible within the confines of large, traditional programs. COMPLEX also realized that a program of small planetary missions was, in and of itself, incapable of meeting all of the prime objectives contained in its report 'An Integrated Strategy for the Planetary Sciences: 1995-2010.' Recommendations are provided for the small planetary missions to fulfill their promise.
McBride, Matthew K; Podgorski, Maciej; Chatani, Shunsuke; Worrell, Brady T; Bowman, Christopher N
2018-06-21
Ductile, cross-linked films were folded as a means to program temporary shapes without the need for complex heating cycles or specialized equipment. Certain cross-linked polymer networks, formed here with the thiol-isocyanate reaction, possessed the ability to be pseudoplastically deformed below the glass transition, and the original shape was recovered during heating through the glass transition. To circumvent the large forces required to plastically deform a glassy polymer network, we have utilized folding, which localizes the deformation in small creases, and achieved large dimensional changes with simple programming procedures. In addition to dimension changes, three-dimensional objects such as swans and airplanes were developed to demonstrate applying origami principles to shape memory. We explored the fundamental mechanical properties that are required to fold polymer sheets and observed that a yield point that does not correspond to catastrophic failure is required. Unfolding occurred during heating through the glass transition, indicating the vitrification of the network that maintained the temporary, folded shape. Folding was demonstrated as a powerful tool to simply and effectively program ductile shape-memory polymers without the need for thermal cycling.
Glass sample preparation and performance investigations
NASA Astrophysics Data System (ADS)
Johnson, R. Barry
1992-04-01
This final report details the work performed under this delivery order from April 1991 through April 1992. The currently available capabilities for integrated optical performance modeling at MSFC for large and complex systems such as AXAF were investigated. The Integrated Structural Modeling (ISM) program developed by Boeing for the U.S. Air Force was obtained and installed on two DECstations 5000 at MSFC. The structural, thermal and optical analysis programs available in ISM were evaluated. As part of the optomechanical engineering activities, technical support was provided in the design of support structure, mirror assembly, filter wheel assembly and material selection for the Solar X-ray Imager (SXI) program. As part of the fabrication activities, a large number of zerodur glass samples were prepared in different sizes and shapes for acid etching, coating and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations. Various optical components for AXAF video microscope and the x-ray test facility were also fabricated. A number of glass fabrication and test instruments such as a scatter plate interferometer, a gravity feed saw and some phenolic cutting blades were fabricated, integrated and tested.
A Large-Scale Assessment of Nucleic Acids Binding Site Prediction Programs
Miao, Zhichao; Westhof, Eric
2015-01-01
Computational prediction of nucleic acid binding sites in proteins is necessary to disentangle functional mechanisms in most biological processes and to explore the binding mechanisms. Several strategies have been proposed, but the state-of-the-art approaches display a great diversity in i) the definition of nucleic acid binding sites; ii) the training and test datasets; iii) the algorithmic methods for the prediction strategies; iv) the performance measures and v) the distribution and availability of the prediction programs. Here we report a large-scale assessment of 19 web servers and 3 stand-alone programs on 41 datasets including more than 5000 proteins derived from 3D structures of protein-nucleic acid complexes. Well-defined binary assessment criteria (specificity, sensitivity, precision, accuracy…) are applied. We found that i) the tools have been greatly improved over the years; ii) some of the approaches suffer from theoretical defects and there is still room for sorting out the essential mechanisms of binding; iii) RNA binding and DNA binding appear to follow similar driving forces and iv) dataset bias may exist in some methods. PMID:26681179
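The binary assessment criteria named above reduce to counts of true/false positives and negatives over residues labelled as binding or non-binding. A minimal sketch, not taken from the assessment itself, using hypothetical per-residue boolean labels:

# Minimal sketch (not from the paper): binary assessment criteria for
# per-residue binding-site predictions, given hypothetical 0/1 labels.
def binary_metrics(predicted, actual):
    tp = sum(p and a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0   # also called recall
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    precision   = tp / (tp + fp) if (tp + fp) else 0.0
    accuracy    = (tp + tn) / len(actual) if actual else 0.0
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "accuracy": accuracy}

# Example: residue-level predictions for a toy five-residue protein
print(binary_metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1]))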
Nelson, Geoffrey; Macnaughton, Eric; Goering, Paula
2015-11-01
Using the case of a large-scale, multi-site Canadian Housing First research demonstration project for homeless people with mental illness, At Home/Chez Soi, we illustrate the value of qualitative methods in a randomized controlled trial (RCT) of a complex community intervention. We argue that quantitative RCT research can neither capture the complexity nor tell the full story of a complex community intervention. We conceptualize complex community interventions as having multiple phases and dimensions that require both RCT and qualitative research components. Rather than assume that qualitative research and RCTs are incommensurate, a more pragmatic mixed methods approach was used, which included using both qualitative and quantitative methods to understand program implementation and outcomes. At the same time, qualitative research was used to examine aspects of the intervention that could not be understood through the RCT, such as its conception, planning, sustainability, and policy impacts. Through this example, we show how qualitative research can tell a more complete story about complex community interventions. Copyright © 2015 Elsevier Inc. All rights reserved.
Best geoscience approach to complex systems in environment
NASA Astrophysics Data System (ADS)
Mezemate, Yacine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2017-04-01
The environment is a social issue that continues to grow in importance. Its complexity, both cross-disciplinary and multi-scale, has given rise to a large number of scientific and technological locks that complex systems approaches can solve. Significant challenges must be met to achieve an understanding of complex environmental systems. Their study should proceed in several steps in which the use of data and models is crucial: exploration, observation and basic data acquisition; identification of correlations, patterns, and mechanisms; modelling; model validation, implementation and prediction; and construction of a theory. Since e-learning has become a powerful tool for sharing knowledge and best practice, we use it to teach environmental complexity and systems. In this presentation we promote an e-learning course intended for a large audience (undergraduates, graduates, PhD students and young scientists) which gathers and puts in coherence different pedagogical materials on complex systems and environmental studies. The course describes complex processes using numerous illustrations, examples and tests that make the learning process easy to enjoy. For the sake of simplicity, the course is divided into different modules, and at the end of each module a set of exercises and program codes is proposed for best practice. The graphical user interface (GUI), constructed using the open-source tool Opale Scenari, offers simple navigation through the different modules. The course treats the complex systems that can be found in the environment and their observables; we particularly highlight the extreme variability of these observables over a wide range of scales. Using the multifractal formalism through different applications (turbulence, precipitation, hydrology), we demonstrate how such extreme variability of geophysical/biological fields can be used in solving everyday (geo-)environmental challenges.
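For reference, the multifractal formalism invoked here is usually summarized by the moment-scaling relation of a field \varepsilon_\lambda observed at scale ratio \lambda (a standard statement of the formalism, not taken from the course materials):

\langle \varepsilon_\lambda^{\,q} \rangle \;\propto\; \lambda^{K(q)}

where the nonlinearity of the moment scaling function K(q) quantifies the multifractality, and hence the extreme variability across scales, of the field.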
Case management: a case study.
Stanton, M P; Walizer, E M; Graham, J I; Keppel, L
2000-01-01
This article describes the implementation of a pilot case management program at Walter Reed Army Medical Center. In it, we discuss obvious pitfalls and problems in implementing case management in a large multiservice center and the steps and processes implemented to expedite and move case management forward in its early stages. The insights shared may be useful for those implementing case management in a complex medical center situation. Other models used in similar situations are also reviewed.
NASA Technical Reports Server (NTRS)
Imbriale, W. A.; Moore, M.; Rochblatt, D. J.; Veruttipong, W.
1995-01-01
At the NASA Deep Space Network (DSN) Goldstone Complex, a 34-meter- diameter beam-waveguide antenna, DSS-13, was constructed in 1988-1990 and has become an integral part of an advanced systems program and a test bed for technologies being developed to introduce Ka-band (32 GHz) frequencies into the DSN. A method for compensating the gravity- induced structural deformations in this large antenna is presented.
Scalable Multiplexed Ion Trap (SMIT) Program
2010-12-08
an integrated micromirror. The symmetric cross and the mirror trap had a number of complex design features. Both traps shaped the electrodes in...genetic algorithm. 6. Integrated micromirror. The Gen II linear trap (as well as the linear sections of the mirror and the cross) had a number of new...conventional imaging system constructed by off-the-shelf optical components and a micromirror located very close to the ion. A large fraction of photons
Ball bearing heat analysis program (BABHAP)
NASA Technical Reports Server (NTRS)
1978-01-01
The Ball Bearing Heat Analysis Program (BABHAP) is an attempt to assemble a series of equations, some of which are non-linear algebraic systems, in a logical order which, when solved, provides a complex analysis of load distribution among the balls, ball velocities, heat generation resulting from friction, applied load, and ball spinning, minimum lubricant film thickness, and many additional characteristics of ball bearing systems. Although the initial design requirements for BABHAP were dictated by the core limitations of the PDP 11/45 computer (approximately 8K of real words with a limited number of instructions), the program dimensions can easily be expanded for large-core computers such as the UNIVAC 1108. The PDP version of BABHAP is also operational on the UNIVAC system, with the exception that the PDP uses 029 punch and the UNIVAC uses 026. A conversion program was written to allow transfer between machines.
Computers and the design of ion beam optical systems
NASA Astrophysics Data System (ADS)
White, Nicholas R.
Advances in microcomputers have made it possible to maintain a library of advanced ion optical programs that can be used on inexpensive computer hardware and are suitable for the design of a variety of ion beam systems, including ion implanters, giving excellent results. This paper describes in outline the steps typically involved in designing a complete ion beam system for materials modification applications. Two computer programs are described which, although based largely on algorithms that have been in use for many years, make possible detailed beam optical calculations using microcomputers, specifically the IBM PC. OPTICIAN is an interactive first-order program for tracing beam envelopes through complex optical systems. SORCERY is a versatile program for solving Laplace's and Poisson's equations by finite difference methods using successive over-relaxation. Ion and electron trajectories can be traced through these potential fields, and plots of beam emittance obtained.
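As an illustration of the numerical core attributed to SORCERY, successive over-relaxation for Laplace's equation on a uniform 2-D grid can be sketched in a few lines; this is a generic textbook version in Python under assumed boundary conditions, not the SORCERY code itself:

# Illustrative sketch: successive over-relaxation (SOR) for Laplace's
# equation on a uniform 2-D grid with fixed boundary potentials.
import numpy as np

def sor_laplace(phi, omega=1.8, tol=1e-5, max_iter=5000):
    # Relax interior grid points in place; boundary values are held fixed.
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, phi.shape[0] - 1):
            for j in range(1, phi.shape[1] - 1):
                new = 0.25 * (phi[i+1, j] + phi[i-1, j] + phi[i, j+1] + phi[i, j-1])
                change = omega * (new - phi[i, j])
                phi[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    return phi

# Example: a hypothetical electrode at 1 kV on the top edge, grounded elsewhere
grid = np.zeros((40, 40))
grid[0, :] = 1000.0
potential = sor_laplace(grid)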
A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.
Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie
2018-06-04
Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
NASA Astrophysics Data System (ADS)
Takahashi, Riku; Wu, Zi Liang; Arifuzzaman, Md; Nonoyama, Takayuki; Nakajima, Tasuku; Kurokawa, Takayuki; Gong, Jian Ping
2014-08-01
Biomacromolecules usually form complex superstructures in natural biotissues, such as different alignments of collagen fibres in articular cartilage, for multifunctionality. Inspired by nature, there are efforts towards developing multiscale ordered structures in hydrogels (recognized as one of the best candidates for soft biotissues). However, creating complex superstructures in gels has hardly been realized because of the absence of effective approaches to control the localized molecular orientation. Here we introduce a method to create various superstructures of rigid polyanions in polycationic hydrogels. The control of localized orientation of rigid molecules, which are sensitive to the internal stress field of the gel, is achieved by tuning the swelling mismatch between masked and unmasked regions of the photolithographically patterned gel. Furthermore, we develop a double-network structure to toughen the hydrogels with programmed superstructures, which deform reversibly under large strain. This work presents a promising pathway to develop superstructures in hydrogels and should shed light on designing biomimetic materials with intricate molecular alignments.
Localization of diffusion sources in complex networks with sparse observations
NASA Astrophysics Data System (ADS)
Hu, Zhao-Long; Shen, Zhesi; Tang, Chang-Bing; Xie, Bin-Bin; Lu, Jian-Feng
2018-04-01
Locating sources in a large network is of paramount importance to reduce the spreading of disruptive behavior. Based on a backward diffusion-based method and integer programming, we propose an efficient approach to locate sources in complex networks with limited observers. The results on model networks and empirical networks demonstrate that, for a certain fraction of observers, the accuracy of our method for source localization improves as network size increases. Besides, compared with the previous method (the maximum-minimum method), the performance of our method is much better with a small fraction of observers, especially in heterogeneous networks. Furthermore, our method is more robust against noisy environments and against the strategies used to choose observers.
Managing System of Systems Requirements with a Requirements Screening Group
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald R. Barden
2012-07-01
Figuring out an effective and efficient way to manage not only your Requirements Baseline, but also the development of all your individual requirements during a Program's/Project's Conceptual and Development Life Cycle Stages can be both daunting and difficult. This is especially so when you are dealing with a complex and large System of Systems (SoS) Program with potentially thousands and thousands of Top Level Requirements as well as an equal number of lower level System, Subsystem and Configuration Item requirements that need to be managed. This task is made even more overwhelming when you have to add in integration with multiple requirements development teams (e.g., Integrated Product Development Teams (IPTs)) and/or numerous System/Subsystem Design Teams. One solution for tackling this difficult activity on a recent large System of Systems Program was to develop and make use of a Requirements Screening Group (RSG). This group is essentially a team made up of co-chairs from the various Stakeholders with an interest in the Program of record that are enabled and accountable for Requirements Development on the Program/Project. The RSG co-chairs, often with the help of an individual support team, work together as a Program Board to monitor, make decisions on, and provide guidance on all Requirements Development activities during the Conceptual and Development Life Cycle Stages of a Program/Project. In addition, the RSG can establish and maintain the Requirements Baseline, monitor and enforce requirements traceability across the entire Program, and work with other elements of the Program/Project to ensure integration and coordination.
Unique barriers and needs in weight management for obese women with fibromyalgia.
Craft, Jennifer M; Ridgeway, Jennifer L; Vickers, Kristin S; Hathaway, Julie C; Vincent, Ann; Oh, Terry H
2015-01-01
The aim of this study was to identify barriers, needs, and preferences of weight management intervention for women with fibromyalgia (FM). Obesity appears in higher rates in women with fibromyalgia compared to the population at large, and no study to date has taken a qualitative approach to better understand how these women view weight management in relation to their disease and vice versa. We designed a qualitative interview study with women patients with FM and obesity. Women (N = 15) were recruited by their participation in a fibromyalgia treatment program (FTP) within the year prior. The women approached for the study met the following inclusion criteria: confirmed diagnosis of FM, age between 30 and 60 years (M = 51 ± 6.27), and body mass index (BMI) ≥ 30 (M = 37.88 ± 4.87). Patients completed questionnaire data prior to their participation in focus groups (N = 3), including weight loss history, physical activity data, the Revised Fibromyalgia Impact Questionnaire (FIQR), and the Patient Health Questionnaire 9-item (PHQ-9). Three focus group interviews were conducted to collect qualitative data. Consistent themes were revealed within and between groups. Patients expressed the complex relationships between FM symptoms, daily responsibilities, and weight management. Weight was viewed as an emotionally laden topic requiring compassionate delivery of programming from an empathetic leader who is knowledgeable about fibromyalgia. Patients view themselves as complex and different, requiring a specifically tailored weight management program for women with FM. Women with FM identify unique barriers to weight management, including the complex interrelationships between symptoms of FM and health behaviors, such as diet and exercise. They prefer a weight management program for women with FM that consists of an in-person, group-based approach with a leader but are open to a tailored conventional weight management program. Feasibility may be one of the biggest barriers to such a program both from an institutional and individual perspective. Copyright © 2015 Elsevier Inc. All rights reserved.
Macromolecular Origins of Harmonics Higher than the Third in Large-Amplitude Oscillatory Shear Flow
NASA Astrophysics Data System (ADS)
Giacomin, Alan; Jbara, Layal; Gilbert, Peter; Chemical Engineering Department Team
2016-11-01
In 1935, Andrew Gemant conceived of the complex viscosity, a rheological material function measured by "jiggling" an elastic liquid in oscillatory shear. This test reveals information about both the viscous and elastic properties of the liquid, and about how these properties depend on frequency. The test gained popularity with chemists when John Ferry perfected instruments for measuring both the real and imaginary parts of the complex viscosity. In 1958, Cox and Merz discovered that the steady shear viscosity curve was easily deduced from the magnitude of the complex viscosity, and today oscillatory shear is the single most popular rheological property measurement. With oscillatory shear, we can control two things: the frequency (Deborah number) and the shear rate amplitude (Weissenberg number). When the Weissenberg number is large, elastic liquids respond with a shear stress over a series of odd multiples of the test frequency. In this lecture we will explore recent attempts to deepen our understanding of the physics of these higher harmonics, including especially harmonics higher than the third. This work was supported by the Canada Research Chairs program of the Government of Canada and by the Natural Sciences and Engineering Research Council of Canada (NSERC) Tier 1 Canada Research Chair in Rheology.
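The Cox-Merz observation mentioned above is commonly written as an empirical rule relating the steady shear viscosity to the magnitude of the complex viscosity at a matching angular frequency (a standard textbook form, not a result specific to this lecture):

\eta(\dot{\gamma}) \;\approx\; \left| \eta^{*}(\omega) \right| = \sqrt{\eta'(\omega)^{2} + \eta''(\omega)^{2}}\,, \qquad \omega = \dot{\gamma}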
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Discovery of a Bright Equatorial Storm on Neptune
NASA Astrophysics Data System (ADS)
Molter, E. M.; De Pater, I.; Alvarez, C.; Tollefson, J.; Luszcz-Cook, S.
2017-12-01
Images of Neptune, taken with the NIRC2 instrument during testing of the new Twilight Zone observing program at Keck Observatory, revealed an extremely large bright storm system near Neptune's equator. The storm complex is ≈9,000 km across and brightened considerably between June 26 and July 2. Historically, very bright clouds have occasionally been seen on Neptune, but always in the midlatitude regions between ≈15° and ≈60° North or South. Voyager and HST observations have shown that cloud features large enough to dominate near-IR photometry are often "companion" clouds of dark anti-cyclonic vortices similar to Jupiter's Great Red Spot, interpreted as orographic clouds. In the past such clouds and their coincident dark vortices often persisted for one to several years. However, the cloud complex we detect is unique: never before has a bright cloud been seen at, or so close to, the equator. The discovery points to a drastic departure in the dynamics of Neptune's atmosphere from what has been observed for the past several decades. Detections of the complex in multiple NIRC2 filters allow radiative transfer modeling to constrain the cloud's altitude and vertical extent.
NASA Technical Reports Server (NTRS)
Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.
2012-01-01
This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach, through reduced development and design cycle time, include: creation of analysis models for the aerodynamic discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.
Local wavelet transform: a cost-efficient custom processor for space image compression
NASA Astrophysics Data System (ADS)
Masschelein, Bart; Bormans, Jan G.; Lafruit, Gauthier
2002-11-01
Thanks to its intrinsic scalability features, the wavelet transform has become increasingly popular as a decorrelator in image compression applications. Throughput, memory requirements and complexity are important parameters when developing hardware image compression modules. An implementation of the classical, global wavelet transform requires large memory sizes and implies a large latency between the availability of the input image and the production of minimal data entities for entropy coding. Image tiling methods, as proposed by JPEG2000, reduce the memory sizes and the latency, but inevitably introduce image artefacts. The Local Wavelet Transform (LWT), presented in this paper, is a low-complexity wavelet transform architecture using block-based processing that results in the same transformed images as those obtained by the global wavelet transform. The architecture minimizes the processing latency with a limited amount of memory. Moreover, as the LWT is an instruction-based custom processor, it can be programmed for specific tasks, such as push-broom processing of infinite-length satellite images. The features of the LWT make it appropriate for use in space image compression, where high throughput, low memory sizes, low complexity, low power and push-broom processing are important requirements.
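For orientation, the decorrelating step that any wavelet stage performs can be illustrated with a single-level 2-D Haar transform; the sketch below (Python, hypothetical tile data) shows only that generic step and does not reproduce the block-based scheduling that distinguishes the LWT architecture:

# Illustrative sketch only: one 2-D Haar analysis level, returning the
# approximation (LL) and three detail subbands used by entropy coding.
import numpy as np

def haar2d_level(img):
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

tile = np.random.rand(8, 8)                   # hypothetical image tile
ll, lh, hl, hh = haar2d_level(tile)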
Reid Ponte, Patricia; Hayman, Laura L; Berry, Donna L; Cooley, Mary E
2015-01-01
The University of Massachusetts Boston and Dana-Farber/Harvard Cancer Center joined forces in 2009 to create a Postdoctoral Nursing Research Fellowship in Cancer and Health Disparities. In combining the resources of a large university and a research-intensive service institution, the postdoctoral program provides a new model for preparing nurse scientists to conduct independent research that advances nursing knowledge and interdisciplinary understanding of complex health issues. The multifaceted program consists of educational programming, research training, and career planning components. Additionally, each fellow is assigned a nurse scientist mentor and interdisciplinary co-mentor. The mentors support the fellows with scholarly activities and research training and help the fellows craft individualized career plans, including proposals for postfellowship career development research. In this article, the postdoctoral program leaders describe the program structure, strategies used to recruit minority and nonminority candidates, and data describing program outcomes and share lessons learned and recommendations for organizations that may be interested in establishing similar postdoctoral fellowships at their institutions. Copyright © 2015 Elsevier Inc. All rights reserved.
Variable-Speed Power-Turbine for the Large Civil Tilt Rotor
NASA Technical Reports Server (NTRS)
Suchezky, Mark; Cruzen, G. Scott
2012-01-01
Turbine design concepts were studied for application to a large civil tiltrotor transport aircraft. The concepts addressed the need for high turbine efficiency across the broad 2:1 turbine operating speed range representative of the notional mission for the aircraft. The study focused on tailoring basic turbine aerodynamic design parameters to avoid the need for complex, heavy, and expensive variable-geometry features. The results of the study showed that good turbine performance can be achieved across the design speed range if the design focuses on tailoring the aerodynamics for good tolerance to large swings in incidence, as opposed to optimizing for best performance at the long-range cruise design point. A rig design configuration and program plan are suggested for a dedicated experiment to validate the proposed approach.
An interactive graphics program for manipulation and display of panel method geometry
NASA Technical Reports Server (NTRS)
Hall, J. F.; Neuhart, D. H.; Walkley, K. B.
1983-01-01
Modern aerodynamic panel methods that handle large, complex geometries have made evident the need to interactively manipulate, modify, and view such configurations. With this purpose in mind, the GEOM program was developed. It is a menu driven, interactive program that uses the Tektronix PLOT 10 graphics software to display geometry configurations which are characterized by an abutting set of networks. These networks are composed of quadrilateral panels which are described by the coordinates of their corners. GEOM is divided into fourteen executive controlled functions. These functions are used to build configurations, scale and rotate networks, transpose networks defining M and N lines, graphically display selected networks, join and split networks, create wake networks, produce symmetric images of networks, repanel and rename networks, display configuration cross sections, and output network geometry in two formats. A data base management system is used to facilitate data transfers in this program. A sample session illustrating various capabilities of the code is included as a guide to program operation.
Generic concept to program the time domain of self-assemblies with a self-regulation mechanism.
Heuser, Thomas; Steppert, Ann-Kathrin; Lopez, Catalina Molano; Zhu, Baolei; Walther, Andreas
2015-04-08
Nature regulates complex structures in space and time via feedback loops, kinetically controlled transformations, and under energy dissipation to allow non-equilibrium processes. Although man-made static self-assemblies realize excellent control over hierarchical structures via molecular programming, managing their temporal destiny by self-regulation is a largely unsolved challenge. Herein, we introduce a generic concept to control the time domain by programming the lifetimes of switchable self-assemblies in closed systems. We conceive dormant deactivators that, in combination with fast promoters, enable a unique kinetic balance to establish an autonomously self-regulating, transient pH state, whose duration can be programmed over orders of magnitude, from minutes to days. Coupling this non-equilibrium state to pH-switchable self-assemblies allows predicting their assembly/disassembly fate in time, similar to a precise self-destruction mechanism. We demonstrate a platform approach by programming self-assembly lifetimes of block copolymers, nanoparticles, and peptides, enabling dynamic materials with a self-regulation functionality.
Integrated digital flight-control system for the space shuttle orbiter
NASA Technical Reports Server (NTRS)
1973-01-01
The integrated digital flight control system is presented which provides rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effectors by using an executive routine/functional subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the GN&C computer complex and is equally insensitive to the characteristics of the processor configuration. The integrated structure of the control system and the DFCS executive routine which embodies that structure are described along with the input and output. The specific estimation and control algorithms used in the various mission phases are given.
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, and is applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The Wire programs deal with the application of multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
Building the team for team science
Read, Emily K.; O'Rourke, M.; Hong, G. S.; Hanson, P. C.; Winslow, Luke A.; Crowley, S.; Brewer, C. A.; Weathers, K. C.
2016-01-01
The ability to effectively exchange information and develop trusting, collaborative relationships across disciplinary boundaries is essential for 21st century scientists charged with solving complex and large-scale societal and environmental challenges, yet these communication skills are rarely taught. Here, we describe an adaptable training program designed to increase the capacity of scientists to engage in information exchange and relationship development in team science settings. A pilot of the program, developed by a leader in ecological network science, the Global Lake Ecological Observatory Network (GLEON), indicates that the training program resulted in improvement in early career scientists’ confidence in team-based network science collaborations within and outside of the program. Fellows in the program navigated human-network challenges, expanded communication skills, and improved their ability to build professional relationships, all in the context of producing collaborative scientific outcomes. Here, we describe the rationale for key communication training elements and provide evidence that such training is effective in building essential team science skills.
Students' explanations in complex learning of disciplinary programming
NASA Astrophysics Data System (ADS)
Vieira, Camilo
Computational Science and Engineering (CSE) has been denominated as the third pillar of science and as a set of important skills to solve the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from the engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include, but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generate higher-quality explanations than students in the black box context. These explanations are related to students' prior experiences. Specifically, students with low programming ability engage in a more thorough explanation process than students with high ability. This dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.
Commercial Complexity and Local and Global Involvement in Programs: Effects on Viewer Responses.
ERIC Educational Resources Information Center
Oberman, Heiko; Thorson, Esther
A study investigated the effects of local (momentary) and global (whole program) involvement in program context and the effects of message complexity on the retention of television commercials. Sixteen commercials, categorized as simple video/simple audio through complex video/complex audio, were edited into two globally high- and two globally…
Solving large mixed linear models using preconditioned conjugate gradient iteration.
Strandén, I; Lidauer, M
1999-12-01
Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique feasible in Jacobi and conjugate gradient based iterative methods using iteration on data is presented. In the new computing technique, the calculations in the multiplication of a vector by a matrix were divided into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software. Programs keeping the mixed model equations in random access memory required at least 20 and 435% more time to solve the univariate and multivariate animal models, respectively. Computations of the second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than did the new program. Good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
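For context, a standard preconditioned conjugate gradient iteration with a simple Jacobi (diagonal) preconditioner looks as follows; this is a generic textbook sketch in Python, not the breeding-value program or its three-step iteration-on-data technique:

# Minimal sketch: preconditioned conjugate gradient for A x = b,
# with A symmetric positive definite and a Jacobi preconditioner.
import numpy as np

def pcg(A, b, tol=1e-8, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    m_inv = 1.0 / np.diag(A)          # Jacobi (diagonal) preconditioner
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = m_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(pcg(A, b))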
Workspace Program for Complex-Number Arithmetic
NASA Technical Reports Server (NTRS)
Patrick, M. C.; Howell, Leonard W., Jr.
1986-01-01
COMPLEX is a workspace program designed to empower APL with complex-number capabilities. Complex-variable methods provide analytical tools invaluable for applications in mathematics, science, and engineering. COMPLEX is written in APL.
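For comparison only, most general-purpose languages now expose complex arithmetic natively; the short Python illustration below is unrelated to the APL workspace itself and merely indicates the kind of capability COMPLEX adds to APL:

# Complex arithmetic built into the language: product, modulus, argument.
import cmath

z1 = 3 + 4j
z2 = complex(1, -2)
print(z1 * z2, abs(z1), cmath.phase(z1))
print(cmath.exp(1j * cmath.pi))   # Euler's identity, approximately -1+0j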
Strategies for responding to RAC requests electronically.
Schramm, Michael
2012-04-01
Providers that would like to respond to complex RAC reviews electronically should consider three strategies: Invest in an EHR software package or a high-powered scanner that can quickly scan large amounts of paper. Implement an audit software platform that will allow providers to manage the entire audit process in one place. Use a CONNECT-compatible gateway capable of accessing the Nationwide Health Information Network (the network on which the electronic submission of medical documentation program runs).
Development and application of optimum sensitivity analysis of structures
NASA Technical Reports Server (NTRS)
Barthelemy, J. F. M.; Hallauer, W. L., Jr.
1984-01-01
The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single level solutions was completed and tested out.
Apollo Program Management, Kennedy Space Center, Florida. Volume 4
NASA Technical Reports Server (NTRS)
1968-01-01
The evolution of the Kennedy Space Center as the launch organization for Apollo/ Saturn V involved the concurrent solution of numerous complex problems. A significant increase in manpower was involved. Large and complex checkout and launch facilities were to be designed and constructed. Expansion of operational capabilities required the establishment and integration of a Government-Contractor operational team. From an initial cadre of approximately 200 civil service personnel of the Army Ballistic Missile Agency, transferred to NASA in 1960 following its establishment, expansion to the present civil service level of 2,900 occurred in the last seven years. Established within NASA as a directorate of the Marshall Space Flight Center, KSC achieved center status in 1962. With its designation as a Center, KSC accomplished the development and staffing of an organization that could perform procurement, resources, financial, and other management requirements formerly provided by the parent organization. In addition to continuing launch operations for established programs, KSC undertook the design and construction of large, new, and unique launch facilities for Apollo/Saturn V. With the expansion of the civil service work force, KSC integrated contractor organizations employing 23,000 personnel at the Center to perform specific operational and support missions under the technical supervision and observation of the Government team. The management techniques, organizational concepts, and continuing efforts utilized to meet the Apollo goals and challenges are discussed in this document.
Context Switching with Multiple Register Windows: A RISC Performance Study
NASA Technical Reports Server (NTRS)
Konsek, Marion B.; Reed, Daniel A.; Watcharawittayakul, Wittaya
1987-01-01
Although previous studies have shown that a large file of overlapping register windows can greatly reduce procedure call/return overhead, the effects of register windows in a multiprogramming environment are poorly understood. This paper investigates the performance of multiprogrammed, reduced instruction set computers (RISCs) as a function of window management strategy. Using an analytic model that reflects context switch and procedure call overheads, we analyze the performance of simple, linearly self-recursive programs. For more complex programs, we present the results of a simulation study. These studies show that a simple strategy that saves all windows prior to a context switch, but restores only a single window following a context switch, performs near optimally.
EMGAN: A computer program for time and frequency domain reduction of electromyographic data
NASA Technical Reports Server (NTRS)
Hursta, W. N.
1975-01-01
An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.
The dyskerin ribonucleoprotein complex as an OCT4/SOX2 coactivator in embryonic stem cells
Fong, Yick W; Ho, Jaclyn J; Inouye, Carla; Tjian, Robert
2014-01-01
Acquisition of pluripotency is driven largely at the transcriptional level by activators OCT4, SOX2, and NANOG that must in turn cooperate with diverse coactivators to execute stem cell-specific gene expression programs. Using a biochemically defined in vitro transcription system that mediates OCT4/SOX2 and coactivator-dependent transcription of the Nanog gene, we report the purification and identification of the dyskerin (DKC1) ribonucleoprotein complex as an OCT4/SOX2 coactivator whose activity appears to be modulated by a subset of associated small nucleolar RNAs (snoRNAs). The DKC1 complex occupies enhancers and regulates the expression of key pluripotency genes critical for self-renewal in embryonic stem (ES) cells. Depletion of DKC1 in fibroblasts significantly decreased the efficiency of induced pluripotent stem (iPS) cell generation. This study thus reveals an unanticipated transcriptional role of the DKC1 complex in stem cell maintenance and somatic cell reprogramming. DOI: http://dx.doi.org/10.7554/eLife.03573.001 PMID:25407680
The impact of RAC audits on US hospitals.
Harrison, Jeffrey P; Barksdale, Rachel M
2013-01-01
The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA) authorized a three-year demonstration program using recovery audit contractors (RACs) to identify and correct improper payments in the Medicare Fee-For-Service program. More recently, Section 6411 of the Affordable Care Act (ACA) expanded the RAC program to include the Medicaid program. This shows the Centers for Medicare & Medicaid Services (CMS) believe RAC audits are a cost-effective method to ensure health care providers are paid correctly and thereby protect the Medicare Trust Fund. RAC audits are highly complex and require significant manpower to handle the large volume of requests received during a short period of time. Additionally, the RAC audit appeal process is complicated and requires a high level of technical expertise. The demonstration project found that RAC audits resulted in sizeable amounts of overpayments collected ("take-backs") from many providers. This research study assesses the potential impact of the RAC audit program on US acute care hospitals. Data obtained from CMS show that RAC overpayments collected for FY 2010 were $75.4 million, increased to $797.4 million in FY 2011, and increased to $986.2 million in the first six months of FY 2012. According to the American Hospital Association (AHA) RACTrac audit survey, the vast majority of these collections represent complex denials where hospitals are required to provide medical record documents in support of their billed claims. This study found that the RAC audit program collections are increasing significantly over time. As a result, these collections are having a significant negative impact on the profitability of US hospitals.
Pezzulo, G; Levin, M
2015-12-01
A major goal of regenerative medicine and bioengineering is the regeneration of complex organs, such as limbs, and the capability to create artificial constructs (so-called biobots) with defined morphologies and robust self-repair capabilities. Developmental biology presents remarkable examples of systems that self-assemble and regenerate complex structures toward their correct shape despite significant perturbations. A fundamental challenge is to translate progress in molecular genetics into control of large-scale organismal anatomy, and the field is still searching for an appropriate theoretical paradigm for facilitating control of pattern homeostasis. However, computational neuroscience provides many examples in which cell networks - brains - store memories (e.g., of geometric configurations, rules, and patterns) and coordinate their activity towards proximal and distant goals. In this Perspective, we propose that programming large-scale morphogenesis requires exploiting the information processing by which cellular structures work toward specific shapes. In non-neural cells, as in the brain, bioelectric signaling implements information processing, decision-making, and memory in regulating pattern and its remodeling. Thus, approaches used in computational neuroscience to understand goal-seeking neural systems offer a toolbox of techniques to model and control regenerative pattern formation. Here, we review recent data on developmental bioelectricity as a regulator of patterning, and propose that target morphology could be encoded within tissues as a kind of memory, using the same molecular mechanisms and algorithms so successfully exploited by the brain. We highlight the next steps of an unconventional research program, which may allow top-down control of growth and form for numerous applications in regenerative medicine and synthetic bioengineering.
NASA Astrophysics Data System (ADS)
Pantale, O.; Caperaa, S.; Rakotomalala, R.
2004-07-01
During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. At the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, and high-strain and high-strain-rate problems. In this field, an accurate analysis of large-deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the high amount of plastic flow. In this presentation, the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA is presented. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates the development, maintainability and expandability of such codes. The most significant advantage of OOP is in the modeling of complex physical systems such as deformation processing, where the overall complex problem is partitioned into individual sub-problems based on physical, mathematical or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, the operator overloading procedure and the use of template classes, are detailed. Then we present the approach used for the development of our finite element code through the presentation of the kinematics, conservative and constitutive laws and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.
Using the NASTRAN Thermal Analyzer to simulate a flight scientific instrument package
NASA Technical Reports Server (NTRS)
Lee, H.-P.; Jackson, C. E., Jr.
1974-01-01
The NASTRAN Thermal Analyzer has proven to be a unique and useful tool for thermal analyses involving large and complex structures where small, thermally induced deformations are critical. Among its major advantages are direct grid point-to-grid point compatibility with large structural models; plots of the model that may be generated for both conduction and boundary elements; versatility of applying transient thermal loads especially to repeat orbital cycles; on-line printer plotting of temperatures and rate of temperature changes as a function of time; and direct matrix input to solve linear differential equations on-line. These features provide a flexibility far beyond that available in most finite-difference thermal analysis computer programs.
NASA Astrophysics Data System (ADS)
Levit, Creon; Gazis, P.
2006-06-01
The graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (windows, linux, Apple OSX) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1988-01-01
The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
Attention-deficit hyperactivity disorder (ADHD) and tuberous sclerosis complex.
D'Agati, Elisa; Moavero, Romina; Cerminara, Caterina; Curatolo, Paolo
2009-10-01
The neurobiological basis of attention-deficit hyperactivity disorder (ADHD) in tuberous sclerosis complex is still largely unknown. Cortical tubers may disrupt several brain networks that control different types of attention. Frontal lobe dysfunction due to seizures or epileptiform electroencephalographic discharges may perturb the development of brain systems that underpin attentional and hyperactive functions during a critical early stage of brain maturation. Comorbidity of attention-deficit hyperactivity disorder (ADHD) with mental retardation and autism spectrum disorders is frequent in children with tuberous sclerosis. Attention-deficit hyperactivity disorder (ADHD) may also reflect a direct effect of the abnormal genetic program. Treatment of children with tuberous sclerosis complex with combined symptoms of attention-deficit hyperactivity disorder (ADHD) and epilepsy may represent a challenge for clinicians, because antiepileptic therapy and drugs used to treat attention-deficit hyperactivity disorder (ADHD) may aggravate the clinical picture of each other.
Technology demonstration of starshade manufacturing for NASA's Exoplanet mission program
NASA Astrophysics Data System (ADS)
Kasdin, N. J.; Lisman, D.; Shaklan, S.; Thomson, M.; Cady, E.; Martin, S.; Marchen, L.; Vanderbei, R. J.; Macintosh, B.; Rudd, R. E.; Savransky, D.; Mikula, J.; Lynch, D.
2012-09-01
It is likely that the coming decade will see the development of a large visible-light telescope with enabling technology for imaging exosolar Earthlike planets in the habitable zones of nearby stars. One such technology utilizes an external occulter, a satellite flying far from the telescope and employing a large screen, or starshade, to suppress the incoming starlight sufficiently for detecting and characterizing exoplanets. This trades the added complexity of building the precisely shaped starshade and flying it in formation against simplifications in the telescope, since extremely precise wavefront control is no longer necessary. In this paper we present the results of our project to design, manufacture, and measure a prototype occulter petal as part of NASA's first Technology Development for Exoplanet Missions program. We describe the mechanical design of the starshade and petal, the precision manufacturing tolerances, and the metrology approach. We demonstrate that the prototype petal meets the requirements and is consistent with a full-size occulter achieving better than 10^-10 contrast.
High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.
Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue
2010-11-13
Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs were approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
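For readers unfamiliar with the PRR, it compares the proportion of reports mentioning a given adverse event among reports for the drug of interest against the same proportion among all other reports. Below is a minimal single-machine sketch under an assumed report format (sets of drugs and events per report); the counting step mirrors what a MapReduce job would distribute, but this is not the paper's implementation:

# Minimal sketch: Proportional Reporting Ratio from a 2x2 contingency table,
# with counts accumulated in a single map-and-combine pass over the reports.
from collections import Counter

def prr(reports, drug, event):
    # reports: iterable of (set_of_drugs, set_of_events) tuples (assumed format)
    counts = Counter()
    for drugs, events in reports:
        counts[(drug in drugs, event in events)] += 1
    a = counts[(True, True)]    # drug and event
    b = counts[(True, False)]   # drug without event
    c = counts[(False, True)]   # event without drug
    d = counts[(False, False)]  # neither
    return (a / (a + b)) / (c / (c + d))

reports = [({"drugX"}, {"nausea"}), ({"drugX"}, set()),
           ({"drugY"}, {"nausea"}), ({"drugY"}, set()), ({"drugY"}, set())]
print(prr(reports, "drugX", "nausea"))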
An efficient solver for large structured eigenvalue problems in relativistic quantum chemistry
NASA Astrophysics Data System (ADS)
Shiozaki, Toru
2017-01-01
We report an efficient program for computing the eigenvalues and symmetry-adapted eigenvectors of very large quaternionic (or Hermitian skew-Hamiltonian) matrices, using which structure-preserving diagonalisation of matrices of dimension N > 10,000 is now routine on a single computer node. Such matrices appear frequently in relativistic quantum chemistry owing to the time-reversal symmetry. The implementation is based on a blocked version of the Paige-Van Loan algorithm, which allows us to use the Level 3 BLAS subroutines for most of the computations. Taking advantage of the symmetry, the program is faster by up to a factor of 2 than state-of-the-art implementations of complex Hermitian diagonalisation; diagonalising a 12,800 × 12,800 matrix took 42.8 (9.5) and 85.6 (12.6) minutes with 1 CPU core (16 CPU cores) using our symmetry-adapted solver and Intel Math Kernel Library's ZHEEV (which is not structure-preserving), respectively. The source code is publicly available under the FreeBSD licence.
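The time-reversal structure mentioned above can be illustrated with a small NumPy sketch (an illustration of the matrix structure, not the authors' solver): an N × N quaternionic Hermitian matrix embeds as a 2N × 2N complex Hermitian matrix H = [[A, B], [-conj(B), conj(A)]] with A Hermitian and B antisymmetric, and the eigenvalues of H then come in Kramers-degenerate pairs, which is the redundancy a structure-preserving solver avoids computing twice.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # quaternionic dimension; the complex embedding is 2n x 2n

# Blocks of a quaternionic Hermitian matrix Q = A + B*j:
# A must be Hermitian and B (complex) antisymmetric, B^T = -B.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (S - S.T) / 2

# Complex embedding carrying the time-reversal structure.
H = np.block([[A, B],
              [-B.conj(), A.conj()]])
assert np.allclose(H, H.conj().T)  # H is Hermitian

# Eigenvalues appear in doubly degenerate (Kramers) pairs, so only n of the
# 2n values are independent.
evals = np.linalg.eigvalsh(H)
print(np.round(evals, 8))
print("paired:", np.allclose(evals[0::2], evals[1::2]))
```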
NASA Astrophysics Data System (ADS)
Petrila, S.; Brabie, G.; Chirita, B.
2016-08-01
The analysis of manufacturing flows within industrial enterprises producing hydrostatic components was made on a number of factors that influence the smooth running of production, such as: the distance between pieces, the waiting time from one operation to another, the time needed to complete setups on CNC machines, and tool changing in the case of a large number of operations and high manufacturing complexity [2]. To optimize the manufacturing flow, the Tecnomatix software was used. This software represents a complete portfolio of digital manufacturing solutions produced by Siemens. It supports innovation by linking all production methods for a product, from process design and process simulation through validation to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behaviour of manufacturing cycles. This program allows the simulation and optimization of production systems and processes in several areas such as: automotive suppliers, production of industrial equipment, electronics manufacturing, and design and production of aerospace and defense parts.
Programming Models for Concurrency and Real-Time
NASA Astrophysics Data System (ADS)
Vitek, Jan
Modern real-time applications are increasingly large, complex and concurrent systems which must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream-processing and concurrent active objects. Some of the key innovations in Flexotasks are that it supports both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically within two distinct implementations, one based on Purdue's Ovm research virtual machine framework and the other on WebSphere, IBM's production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 KHz with a standard deviation of 1.2 μs and have performance competitive with hand-coded C programs.
Computer program for analysis of coupled-cavity traveling wave tubes
NASA Technical Reports Server (NTRS)
Connolly, D. J.; Omalley, T. A.
1977-01-01
A flexible, accurate, large signal computer program was developed for the design of coupled cavity traveling wave tubes. The program is written in FORTRAN IV for an IBM 360/67 time sharing system. The beam is described by a disk model and the slow wave structure by a sequence of cavities, or cells. The computational approach is arranged so that each cavity may have geometrical or electrical parameters different from those of its neighbors. This allows the program user to simulate a tube of almost arbitrary complexity. Input and output couplers, severs, complicated velocity tapers, and other features peculiar to one or a few cavities may be modeled by a correct choice of input data. The beam-wave interaction is handled by an approach in which the radio frequency fields are expanded in solutions to the transverse magnetic wave equation. All significant space harmonics are retained. The program was used to perform a design study of the traveling-wave tube developed for the Communications Technology Satellite. Good agreement was obtained between the predictions of the program and the measured performance of the flight tube.
Ponte, Patricia Reid; Hayman, Laura L; Berry, Donna L; Cooley, Mary E
2016-01-01
The University of Massachusetts Boston and Dana-Farber/Harvard Cancer Center joined forces in 2009 to create a Postdoctoral Nursing Research Fellowship in Cancer and Health Disparities. In combining the resources of a large university and a research-intensive service institution, the postdoctoral program provides a new model for preparing nurse scientists to conduct independent research that advances nursing knowledge and interdisciplinary understanding of complex health issues. The multi-faceted program consists of educational programming, research training, and career planning components. Additionally, each fellow is assigned a nurse scientist mentor and interdisciplinary co-mentor. The mentors support the fellows with scholarly activities and research training and help the fellows craft individualized career plans, including proposals for post-fellowship career development research. In this article, the postdoctoral program leaders describe the program structure, strategies used to recruit minority and non-minority candidates, and data describing program outcomes, and share lessons learned and recommendations for organizations that may be interested in establishing similar postdoctoral fellowships at their institutions. PMID:25771193
Array data extractor (ADE): a LabVIEW program to extract and merge gene array data.
Kurtenbach, Stefan; Kurtenbach, Sarah; Zoidl, Georg
2013-12-01
Large data sets from gene expression array studies are publicly available offering information highly valuable for research across many disciplines ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies persists. Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed known regulation of a beta 1 adrenergic receptor and further indicate novel research targets. Although existing software allows for complex data analyses, the LabVIEW based program presented here, "Array Data Extractor (ADE)", provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows applying changes to the program without the need of advanced programming knowledge.
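ADE itself is written in LabVIEW; purely as an illustration of the underlying task (look up a gene list in several normalized expression matrices and merge the hits into one table), a hypothetical pandas sketch could look like the following, with the study data and column labels invented.

```python
import pandas as pd

# Hypothetical normalized datasets: rows are genes, columns are samples.
study1 = pd.DataFrame(
    {"normal_1": [5.2, 1.1, 3.3], "diseased_1": [6.8, 1.0, 3.1]},
    index=["ADRB1", "GPR12", "ACTB"],
)
study2 = pd.DataFrame(
    {"normal_2": [5.0, 3.4], "diseased_2": [6.5, 3.5]},
    index=["ADRB1", "ACTB"],
)

genes_of_interest = ["ADRB1", "GPR12"]

def extract(dataset: pd.DataFrame, genes: list) -> pd.DataFrame:
    """Keep only the requested genes; genes missing from a study remain as NaN rows."""
    return dataset.reindex(genes)

# Merge the per-study extracts side by side, keyed on the gene identifier.
merged = pd.concat([extract(study1, genes_of_interest),
                    extract(study2, genes_of_interest)], axis=1)
print(merged)
```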
Glass sample preparation and performance investigations. [solar x-ray imager
NASA Technical Reports Server (NTRS)
Johnson, R. Barry
1992-01-01
This final report details the work performed under this delivery order from April 1991 through April 1992. The currently available capabilities for integrated optical performance modeling at MSFC for large and complex systems such as AXAF were investigated. The Integrated Structural Modeling (ISM) program developed by Boeing for the U.S. Air Force was obtained and installed on two DECstations 5000 at MSFC. The structural, thermal and optical analysis programs available in ISM were evaluated. As part of the optomechanical engineering activities, technical support was provided in the design of support structure, mirror assembly, filter wheel assembly and material selection for the Solar X-ray Imager (SXI) program. As part of the fabrication activities, a large number of zerodur glass samples were prepared in different sizes and shapes for acid etching, coating and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations. Various optical components for AXAF video microscope and the x-ray test facility were also fabricated. A number of glass fabrication and test instruments such as a scatter plate interferometer, a gravity feed saw and some phenolic cutting blades were fabricated, integrated and tested.
NASA Technical Reports Server (NTRS)
Djuth, Frank T.; Elder, John H.; Williams, Kenneth L.
1996-01-01
This research program focused on the construction of several key radio wave diagnostics in support of the HF Active Auroral Ionospheric Research Program (HAARP). Project activities led to the design, development, and fabrication of a variety of hardware units and to the development of several menu-driven software packages for data acquisition and analysis. The principal instrumentation includes an HF (28 MHz) radar system, a VHF (50 MHz) radar system, and a high-speed radar processor consisting of three separable processing units. The processor system supports the HF and VHF radars and is capable of acquiring very detailed data with large incoherent scatter radars. In addition, a tunable HF receiver system having high dynamic range was developed primarily for measurements of stimulated electromagnetic emissions (SEE). A separate processor unit was constructed for the SEE receiver. Finally, a large amount of support instrumentation was developed to accommodate complex field experiments. Overall, the HAARP diagnostics are powerful tools for studying diverse ionospheric modification phenomena. They are also flexible enough to support a host of other missions beyond the scope of HAARP. Many new research programs have been initiated by applying the HAARP diagnostics to studies of natural atmospheric processes.
Challenges for Multilevel Health Disparities Research in a Transdisciplinary Environment
Holmes, John H.; Lehman, Amy; Hade, Erinn; Ferketich, Amy K.; Sarah, Gehlert; Rauscher, Garth H.; Abrams, Judith; Bird, Chloe E.
2008-01-01
Numerous factors play a part in health disparities. Although health disparities are manifested at the level of the individual, other contexts should be considered when investigating the associations of disparities with clinical outcomes. These contexts include families, neighborhoods, social organizations, and healthcare facilities. This paper reports on health disparities research as a multilevel research domain from the perspective of a large national initiative. The Centers for Population Health and Health Disparities (CPHHD) program was established by the NIH to examine the highly dimensional, complex nature of disparities and their effects on health. Because of its inherently transdisciplinary nature, the CPHHD program provides a unique environment in which to perform multilevel health disparities research. During the course of the program, the CPHHD centers have experienced challenges specific to this type of research. The challenges were categorized along three axes: sources of subjects and data, data characteristics, and multilevel analysis and interpretation. The CPHHDs collectively offer a unique example of how these challenges are met; just as importantly, they reveal a broad range of issues that health disparities researchers should consider as they pursue transdisciplinary investigations in this domain, particularly in the context of a large team science initiative. PMID:18619398
The Complexity of Leveraging University Program Change
ERIC Educational Resources Information Center
Crow, Gary M.; Arnold, Noelle Witherspoon; Reed, Cynthia J.; Shoho, Alan R.
2012-01-01
This article identifies four elements of complexity that influence how university educational leadership programs can leverage program change: faculty reward systems, faculty governance, institutional resources, and state-level influence on leadership preparation. Following the discussion of the elements of complexity, the article provides a…
Do rational numbers play a role in selection for stochasticity?
Sinclair, Robert
2014-01-01
When a given tissue must, to be able to perform its various functions, consist of different cell types, each fairly evenly distributed and with specific probabilities, then there are at least two quite different developmental mechanisms which might achieve the desired result. Let us begin with the case of two cell types, and first imagine that the proportion of numbers of cells of these types should be 1:3. Clearly, a regular structure composed of repeating units of four cells, three of which are of the dominant type, will easily satisfy the requirements, and a deterministic mechanism may lend itself to the task. What if, however, the proportion should be 10:33? The same simple, deterministic approach would now require a structure of repeating units of 43 cells, and this certainly seems to require a far more complex and potentially prohibitive deterministic developmental program. Stochastic development, replacing regular units with random distributions of given densities, might not be evolutionarily competitive in comparison with the deterministic program when the proportions should be 1:3, but it has the property that, whatever developmental mechanism underlies it, its complexity does not need to depend very much upon target cell densities at all. We are immediately led to speculate that proportions which correspond to fractions with large denominators (such as the 33 of 10/33) may be more easily achieved by stochastic developmental programs than by deterministic ones, and this is the core of our thesis: that stochastic development may tend to occur more often in cases involving rational numbers with large denominators. To be imprecise: that simple rationality and determinism belong together, as do irrationality and randomness.
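The arithmetic behind this argument can be made concrete with a short sketch (an illustration of the reasoning, not a model from the paper): the smallest exactly repeating unit for a target proportion a:b has a+b cells once a/(a+b) is in lowest terms, so its size tracks the denominator, whereas a stochastic program only needs the single probability p = a/(a+b), whatever that denominator is.

```python
from fractions import Fraction
import random

def deterministic_unit_size(a: int, b: int) -> int:
    """Number of cells in the smallest exactly repeating unit realizing proportion a:b."""
    return Fraction(a, a + b).denominator

def stochastic_sample(a: int, b: int, n_cells: int, seed: int = 1) -> float:
    """Fraction of type-A cells when each cell independently becomes type A with p = a/(a+b)."""
    rng = random.Random(seed)
    p = a / (a + b)
    return sum(rng.random() < p for _ in range(n_cells)) / n_cells

for a, b in [(1, 3), (10, 33)]:
    print(f"{a}:{b}  deterministic unit = {deterministic_unit_size(a, b)} cells, "
          f"stochastic estimate over 10000 cells = {stochastic_sample(a, b, 10_000):.3f} "
          f"(target {a / (a + b):.3f})")
```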
Chicxulub Impact Crater and Yucatan Carbonate Platform - PEMEX Oil Exploratory Wells Revisited
NASA Astrophysics Data System (ADS)
Pérez-Drago, G.; Gutierrez-Cirlos, A. G.; Pérez-Cruz, L.; Urrutia-Fucugauchi, J.
2008-12-01
Geophysical oil exploration surveys carried out by PEMEX in the 1940s revealed the occurrence of an anomalous pattern of semi-circular concentric gravity anomalies. The Bouguer gravity anomalies covered an extensive area over the flat carbonate platform in the northwestern Yucatan Peninsula; strong density contrasts were suggestive of a buried igneous complex or basement uplift beneath the carbonates, which was referred to as the Chicxulub structure. The exploration program carried out afterwards included a drilling program, starting with the Chicxulub-1 well in 1952 and comprising eight deep boreholes through the 1970s. An aeromagnetic survey in the late 1970s showed high-amplitude anomalies in the central sector of the gravity anomaly. Thus, research showing Chicxulub as a large complex impact crater formed at the K/T boundary was built on the PEMEX decades-long exploration program. Despite frequent reference to PEMEX information and samples, original data and cores have not been openly available for detailed evaluation and integration with results from recent investigations. Core samples largely remain to be analyzed and interpreted in the context of recent marine, aerial and terrestrial geophysical surveys and the drilling/coring projects of UNAM and ICDP. In this presentation we report on the stratigraphy and paleontological data for the PEMEX wells Chicxulub-1 (1582 m), Sacapuc-1 (1530 m), Yucatan-6 (1631 m), Ticul-1 (3575 m), Yucatan-4 (2398 m), Yucatan-2 (3474 m), Yucatan-5A (3003 m) and Yucatan-1 (3221 m). These wells remain the deepest drilled in Chicxulub, providing samples of impact lithologies, carbonate sequences and basement, which give information on post- and pre-impact stratigraphy and the crystalline basement. We concentrate on stratigraphic columns, lateral correlations and integration with UNAM and ICDP borehole data. Current plans for deep drilling in the Chicxulub crater target the peak ring and central sector, with offshore and onshore boreholes proposed to the IODP and ICDP programs.
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
Functional interactions of HIV-infection and methamphetamine dependence during motor programming.
Archibald, Sarah L; Jacobson, Mark W; Fennema-Notestine, Christine; Ogasawara, Miki; Woods, Steven P; Letendre, Scott; Grant, Igor; Jernigan, Terry L
2012-04-30
Methamphetamine (METH) dependence is frequently comorbid with HIV infection and both have been linked to alterations of brain structure and function. In a previous study, we showed that the brain volume loss characteristic of HIV infection contrasts with METH-related volume increases in striatum and parietal cortex, suggesting distinct neurobiological responses to HIV and METH (Jernigan et al., 2005). Functional magnetic resonance imaging (fMRI) has the potential to reveal functional interactions between the effects of HIV and METH. In the present study, 50 participants were studied in four groups: an HIV+ group, a recently METH-dependent group, a dually affected group, and a group of unaffected community comparison subjects. An fMRI paradigm consisting of motor sequencing tasks of varying levels of complexity was administered to examine blood oxygenation level dependent (BOLD) changes. Within all groups, activity increased significantly with increasing task complexity in large clusters within sensorimotor and parietal cortex, basal ganglia, cerebellum, and cingulate. The task complexity effect was regressed on HIV status, METH status, and the HIV×METH interaction term in a simultaneous multiple regression. HIV was associated with less complexity-related activation in striatum, whereas METH was associated with less complexity-related activation in parietal regions. Significant interaction effects were observed in both cortical and subcortical regions; and, contrary to expectations, the complexity-related activation was less aberrant in dually affected than in single risk participants, in spite of comparable levels of neurocognitive impairment among the clinical groups. Thus, HIV and METH dependence, perhaps through their effects on dopaminergic systems, may have opposing functional effects on neural circuits involved in motor programming. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
PROCOS: computational analysis of protein-protein complexes.
Fink, Florian; Hochrein, Jochen; Wolowski, Vincent; Merkl, Rainer; Gronwald, Wolfram
2011-09-01
One of the main challenges in protein-protein docking is a meaningful evaluation of the many putative solutions. Here we present a program (PROCOS) that calculates a probability-like measure to be native for a given complex. In contrast to scores often used for analyzing complex structures, the calculated probabilities offer the advantage of providing a fixed range of expected values. This will allow, in principle, the comparison of models corresponding to different targets that were solved with the same algorithm. Judgments are based on distributions of properties derived from a large database of native and false complexes. For complex analysis PROCOS uses these property distributions of native and false complexes together with a support vector machine (SVM). PROCOS was compared to the established scoring schemes of ZRANK and DFIRE. Employing a set of experimentally solved native complexes, high probability values above 50% were obtained for 90% of these structures. Next, the performance of PROCOS was tested on the 40 binary targets of the Dockground decoy set, on 14 targets of the RosettaDock decoy set and on 9 targets that participated in the CAPRI scoring evaluation. Again the advantage of using a probability-based scoring system becomes apparent and a reasonable number of near native complexes was found within the top ranked complexes. In conclusion, a novel fully automated method is presented that allows the reliable evaluation of protein-protein complexes. Copyright © 2011 Wiley Periodicals, Inc.
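As a schematic of the scoring idea (a probability-like output from an SVM trained on properties of native versus false complexes), and not of PROCOS's actual descriptors or training data, a minimal scikit-learn sketch with invented two-dimensional property vectors might look like this:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Invented 2-D property vectors (e.g. an interface-contact score and an energy-like term):
# native complexes cluster around one region, false docking poses around another.
native = rng.normal(loc=[1.0, -1.0], scale=0.4, size=(200, 2))
false_ = rng.normal(loc=[-0.5, 0.5], scale=0.6, size=(200, 2))
X = np.vstack([native, false_])
y = np.array([1] * 200 + [0] * 200)   # 1 = native, 0 = false

# SVC with probability=True calibrates a probability-like output (Platt scaling).
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)

candidate_models = np.array([[0.9, -0.8],   # looks native-like
                             [-0.6, 0.7]])  # looks like a false pose
p_native = clf.predict_proba(candidate_models)[:, 1]
print(np.round(p_native, 2))
```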
Flight dynamics research for highly agile aircraft
NASA Technical Reports Server (NTRS)
Nguyen, Luat T.
1989-01-01
This paper highlights recent results of research conducted at the NASA Langley Research Center as part of a broad flight dynamics program aimed at developing technology that will enable future combat aircraft to achieve greatly enhanced agility capability at subsonic combat conditions. Studies of advanced control concepts encompassing both propulsive and aerodynamic approaches are reviewed. Dynamic stall phenomena and their potential impact on maneuvering performance and stability are summarized. Finally, issues of mathematical modeling of complex aerodynamics occurring during rapid, large amplitude maneuvers are discussed.
Pre-sporulation stages of Streptomyces differentiation: state-of-the-art and future perspectives
Yagüe, Paula; López-García, Maria T.; Rioseras, Beatriz; Sánchez, Jesús; Manteca, Ángel
2013-01-01
Streptomycetes comprise very important industrial bacteria, producing two-thirds of all clinically relevant secondary metabolites. They are mycelial microorganisms with complex developmental cycles that include programmed cell death (PCD) and sporulation. Industrial fermentations are usually performed in liquid cultures (large bioreactors), conditions in which Streptomyces strains generally do not sporulate, and it was traditionally assumed that there was no differentiation. In this work, we review the current knowledge on the pre-sporulation stages of Streptomyces differentiation. PMID:23496097
Otis-Green, Shirley; Sidhu, Rupinder K.; Ferraro, Catherine Del; Ferrell, Betty
2014-01-01
Lung cancer patients and their family caregivers face a wide range of potentially distressing symptoms across the four domains of quality of life. A multi-dimensional approach to addressing these complex concerns with early integration of palliative care has proven beneficial. This article highlights opportunities to integrate social work using a comprehensive quality of life model and a composite patient scenario from a large lung cancer educational intervention National Cancer Institute-funded program project grant. PMID:24797998
Programs Automate Complex Operations Monitoring
NASA Technical Reports Server (NTRS)
2009-01-01
Kennedy Space Center, just off the east coast of Florida on Merritt Island, has been the starting place of every human space flight in NASA's history. It is where the first Americans left Earth during Project Mercury, the terrestrial departure point of the lunar-bound Apollo astronauts, as well as the last solid ground many astronauts set foot on before beginning their long stays aboard the International Space Station. It will also be the starting point for future NASA missions to the Moon and Mars and temporary host of the new Ares series rockets designed to take us there. Since the first days of the early NASA missions, in order to keep up with the demands of the intricate and critical Space Program, the launch complex - host to the large Vehicle Assembly Building, two launch pads, and myriad support facilities - has grown increasingly complex to accommodate the sophisticated technologies needed to manage today's space missions. To handle the complicated launch coordination safely, NASA found ways to automate mission-critical applications, resulting in streamlined decision-making. One of these methods, management software called the Control Monitor Unit (CMU), created in conjunction with McDonnell Douglas Space & Defense Systems, has since left NASA, and is finding its way into additional applications.
Compiled MPI: Cost-Effective Exascale Applications Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Quinlan, D; Lumsdaine, A
2012-04-10
The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) A new set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) A compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) Novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) A novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
Revised and extended UTILITIES for the RATIP package
NASA Astrophysics Data System (ADS)
Nikkinen, J.; Fritzsche, S.; Heinäsmäki, S.
2006-09-01
During the last years, the RATIP package has been found useful for calculating the excitation and decay properties of free atoms. Based on the (relativistic) multiconfiguration Dirac-Fock method, this program is used to obtain accurate predictions of atomic properties and to analyze many recent experiments. The daily work with this package made an extension of its UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] desirable in order to facilitate the data handling and interpretation of complex spectra. For this purpose, we make available an enlarged version of the UTILITIES which mainly supports the comparison with experiment as well as large Auger computations. Altogether 13 additional tasks have been appended to the program together with a new menu structure to improve the interactive control of the program.
Program summary
Title of program: RATIP
Catalogue identifier: ADPD_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPD_v2_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Reference in CPC to previous version: S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163
Catalogue identifier of previous version: ADPD
Authors of previous version: S. Fritzsche, Department of Physics, University of Kassel, Heinrich-Plett-Strasse 40, D-34132 Kassel, Germany
Does the new version supersede the original program?: yes
Computer for which the new version is designed and others on which it has been tested: IBM RS 6000, PC Pentium II-IV
Installations: University of Kassel (Germany), University of Oulu (Finland)
Operating systems: IBM AIX, Linux, Unix
Program language used in the new version: ANSI standard Fortran 90/95
Memory required to execute with typical data: 300 kB
No. of bits in a word: All real variables are parameterized by a selected kind parameter and, thus, can be adapted to any required precision if supported by the compiler. Currently, the kind parameter is set to double precision (two 32-bit words) as used also for other components of the RATIP package [S. Fritzsche, C.F. Fischer, C.Z. Dong, Comput. Phys. Comm. 124 (2000) 341; G. Gaigalas, S. Fritzsche, Comput. Phys. Comm. 134 (2001) 86; S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163; S. Fritzsche, J. Elec. Spec. Rel. Phen. 114-116 (2001) 1155]
No. of lines in distributed program, including test data, etc.: 231 813
No. of bytes in distributed program, including test data, etc.: 3 977 387
Distribution format: tar.gzip file
Nature of the physical problem: In order to describe atomic excitation and decay properties also quantitatively, large-scale computations are often needed. In the framework of the RATIP package, the UTILITIES support a variety of (small) tasks. For example, these tasks facilitate the file and data handling in large-scale applications or in the interpretation of complex spectra.
Method of solution: The revised UTILITIES now support a total of 29 subtasks which are mainly concerned with the manipulation of output data as obtained from other components of the RATIP package. Each of these tasks is realized by one or several subprocedures which have access to the corresponding modules of the main components. While the main menu defines seven groups of subtasks for data manipulations and computations, a particular task is selected from one of these group menus. This allows the program to be enlarged later if technical support for further tasks becomes necessary. For each selected task, an interactive dialog about the required input and output data, as well as some additional information, is printed during the execution of the program.
Reasons for the new version: The requirement for enlarging the previous version of the UTILITIES [S. Fritzsche, Comput. Phys. Comm. 141 (2001) 163] arose from the recent application of the RATIP package to large-scale radiative and Auger computations. A number of new subtasks now refer to the handling of Auger amplitudes and their proper combination in order to facilitate the interpretation of complex spectra. A few further tasks, such as the direct access to the one-electron matrix elements for some given set of orbital functions, have been found useful also in the analysis of data.
Summary of revisions: The previous version supported the extraction and handling of atomic data within the framework of RATIP; with the revised version, we now 'add' another 13 tasks which refer to the manipulation of data files, the generation and interpretation of Auger spectra, the computation of various one- and two-electron matrix elements as well as the evaluation of momentum densities and grid parameters. Owing to the rather large number of subtasks, the main menu has been divided into seven groups from which the individual tasks can be selected very similarly as before.
Typical running time: The program responds promptly for most of the tasks. The responding time for some tasks, such as the generation of a relativistic momentum density, strongly depends on the size of the corresponding data files and the number of grid points.
Unusual features of the program: A total of 29 different tasks are supported by the program. Starting from the main menu, the user is guided interactively through the program by a dialog and a few additional explanations. For each task, a short summary about its function is displayed before the program prompts for all the required input data.
StructAlign, a Program for Alignment of Structures of DNA-Protein Complexes.
Popov, Ya V; Galitsyna, A A; Alexeevski, A V; Karyagina, A S; Spirin, S A
2015-11-01
Comparative analysis of structures of complexes of homologous proteins with DNA is important in the analysis of DNA-protein recognition. Alignment is a necessary stage of the analysis. An alignment is a matching of amino acid residues and nucleotides of one complex to residues and nucleotides of the other. Currently, there are no programs available for aligning structures of DNA-protein complexes. We present the program StructAlign, which should fill this gap. The program inputs a pair of complexes of DNA double helix with proteins and outputs an alignment of DNA chains corresponding to the best spatial fit of the protein chains.
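The abstract does not spell out the algorithm; as a hedged illustration of the "best spatial fit of the protein chains" step, the standard Kabsch superposition can be sketched in a few lines of NumPy, with toy coordinates, and with the application of the fitted transform to the DNA chains indicated only in a comment. This is not StructAlign's actual procedure.

```python
import numpy as np

def kabsch(P: np.ndarray, Q: np.ndarray):
    """Rotation R and translation t that best superpose point set P onto Q (least squares)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against an improper rotation
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Toy "protein CA" coordinates of complex 1, and the same points rotated/translated (complex 2).
rng = np.random.default_rng(3)
prot1 = rng.standard_normal((20, 3))
angle = 0.7
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
prot2 = prot1 @ Rz.T + np.array([1.0, -2.0, 0.5])

R, t = kabsch(prot1, prot2)
fitted = prot1 @ R.T + t
rmsd = np.sqrt(np.mean(np.sum((fitted - prot2) ** 2, axis=1)))
print(f"protein-chain RMSD after superposition: {rmsd:.2e}")
# The same (R, t) would then be applied to the DNA chains of complex 1 before
# matching its nucleotides to those of complex 2.
```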
CDC's Emergency Management Program activities - worldwide, 2003-2012.
2013-09-06
In 2003, recognizing the increasing frequency and complexity of disease outbreaks and disasters and a greater risk for terrorism, CDC established the Emergency Operations Center (EOC), bringing together CDC staff members who respond to public health emergencies to enhance communication and coordination. To complement the physical EOC environment, CDC implemented the Incident Management System (IMS), a staffing structure and set of standard operational protocols and services to support and monitor CDC program-led responses to complex public health emergencies. The EOC and IMS are key components of CDC's Emergency Management Program (EMP), which applies emergency management principles to public health practice. To enumerate activities conducted by the EMP during 2003-2012, CDC analyzed data from daily reports and activity logs. The results of this analysis determined that, during 2003-2012, the EMP fully activated the EOC and IMS on 55 occasions to support responses to infectious disease outbreaks, natural disasters, national security events (e.g., conventions, presidential addresses, and international summits), mass gatherings (e.g., large sports and social events), and man-made disasters. On 109 other occasions, the EMP was used to support emergency responses that did not require full EOC activation, and the EMP also conducted 30 exercises and drills. This report provides an overview of those 194 EMP activities.
Karaca, Ezgi; Melquiond, Adrien S J; de Vries, Sjoerd J; Kastritis, Panagiotis L; Bonvin, Alexandre M J J
2010-08-01
Over the last years, large scale proteomics studies have generated a wealth of information of biomolecular complexes. Adding the structural dimension to the resulting interactomes represents a major challenge that classical structural experimental methods alone will have difficulties to confront. To meet this challenge, complementary modeling techniques such as docking are thus needed. Among the current docking methods, HADDOCK (High Ambiguity-Driven DOCKing) distinguishes itself from others by the use of experimental and/or bioinformatics data to drive the modeling process and has shown a strong performance in the critical assessment of prediction of interactions (CAPRI), a blind experiment for the prediction of interactions. Although most docking programs are limited to binary complexes, HADDOCK can deal with multiple molecules (up to six), a capability that will be required to build large macromolecular assemblies. We present here a novel web interface of HADDOCK that allows the user to dock up to six biomolecules simultaneously. This interface allows the inclusion of a large variety of both experimental and/or bioinformatics data and supports several types of cyclic and dihedral symmetries in the docking of multibody assemblies. The server was tested on a benchmark of six cases, containing five symmetric homo-oligomeric protein complexes and one symmetric protein-DNA complex. Our results reveal that, in the presence of either bioinformatics and/or experimental data, HADDOCK shows an excellent performance: in all cases, HADDOCK was able to generate good to high quality solutions and ranked them at the top, demonstrating its ability to model symmetric multicomponent assemblies. Docking methods can thus play an important role in adding the structural dimension to interactomes. However, although the current docking methodologies were successful for a vast range of cases, considering the variety and complexity of macromolecular assemblies, inclusion of some kind of experimental information (e.g. from mass spectrometry, nuclear magnetic resonance, cryoelectron microscopy, etc.) will remain highly desirable to obtain reliable results.
Community-based approaches to address childhood undernutrition and obesity in developing countries.
Shetty, Prakash
2009-01-01
Community-based approaches have been the mainstay of interventions to address the problem of child malnutrition in developing societies. Many programs have been in operation in several countries for decades and originated largely as social welfare, food security and poverty eradication programs. Increasingly conceptual frameworks to guide this activity have been developed as our understanding of the complex nature of the determinants of undernutrition improves. Alongside this evolution, is the accumulation of evidence on the types of interventions in the community that are effective, practical and sustainable. The changing environment is probably determining the altering scenario of child nutrition in developing societies, with rapid developmental transition and urbanization being responsible for the emerging problems of obesity and other metabolic disorders that are largely the result of the now well-recognized linkages between child undernutrition and early onset adult chronic diseases. This dramatic change is contributing to the double burden of malnutrition in developing countries. Community interventions hence need to be integrated and joined up to reduce both aspects of malnutrition in societies. The evidence that community-based nutrition interventions can have a positive impact on pregnancy outcomes and child undernutrition needs to be evaluated to enable programs to prioritize and incorporate the interventions that work in the community. Programs that are operational and successful also need to be evaluated and disseminated in order to enable countries to generate their own programs tailored to tackling the changing nutritional problems of the children in their society. Copyright (c) 2009 S. Karger AG, Basel.
Building a computer-aided design capability using a standard time share operating system
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.
1975-01-01
The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.
Optimal approach to quantum communication using dynamic programming.
Jiang, Liang; Taylor, Jacob M; Khaneja, Navin; Lukin, Mikhail D
2007-10-30
Reliable preparation of entanglement between distant systems is an outstanding problem in quantum information science and quantum communication. In practice, this has to be accomplished by noisy channels (such as optical fibers) that generally result in exponential attenuation of quantum signals at large distances. A special class of quantum error correction protocols, quantum repeater protocols, can be used to overcome such losses. In this work, we introduce a method for systematically optimizing existing protocols and developing more efficient protocols. Our approach makes use of a dynamic programming-based searching algorithm, the complexity of which scales only polynomially with the communication distance, letting us efficiently determine near-optimal solutions. We find significant improvements in both the speed and the final-state fidelity for preparing long-distance entangled states.
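As a rough sketch of the dynamic-programming idea (memoize the best way to build each sub-segment and reuse it), here is a toy Python model with an invented cost function for link generation and entanglement swapping; it does not reproduce the physical model or the fidelity optimization of the paper, only the polynomial-time recursion structure.

```python
from functools import lru_cache

# Toy cost model (NOT the paper's physical model): a segment of n elementary links is built
# by choosing a split point, preparing both halves, and connecting them by entanglement
# swapping with success probability P_SWAP. Waiting for both halves is approximated by
# 1.5x the slower half.
T_LINK = 1.0      # expected time to entangle one elementary link (arbitrary units)
T_SWAP = 0.1      # classical-communication / operation overhead per swap
P_SWAP = 0.9      # success probability of one swapping attempt

@lru_cache(maxsize=None)
def expected_time(n_links: int) -> float:
    """Best expected preparation time for a chain of n_links elementary links (toy model)."""
    if n_links == 1:
        return T_LINK
    best = float("inf")
    for k in range(1, n_links):                     # try every split point
        both_ready = 1.5 * max(expected_time(k), expected_time(n_links - k))
        best = min(best, (both_ready + T_SWAP) / P_SWAP)
    return best

# O(n) subproblems, O(n) work each: polynomial in the number of links, as in the DP search.
for n in (2, 4, 8, 16):
    print(n, "links -> expected time", round(expected_time(n), 3))
```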
Strategic planning for disaster recovery with stochastic last mile distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, Russell Whitford; Van Hentenryck, Pascal; Coffrin, Carleton
2010-01-01
This paper considers the single commodity allocation problem (SCAP) for disaster recovery, a fundamental problem faced by all populated areas. SCAPs are complex stochastic optimization problems that combine resource allocation, warehouse routing, and parallel fleet routing. Moreover, these problems must be solved under tight runtime constraints to be practical in real-world disaster situations. This paper formalizes the specification of SCAPs and introduces a novel multi-stage hybrid-optimization algorithm that utilizes the strengths of mixed integer programming, constraint programming, and large neighborhood search. The algorithm was validated on hurricane disaster scenarios generated by Los Alamos National Laboratory using state-of-the-art disaster simulation tools and is deployed to aid federal organizations in the US.
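As a schematic of how the large-neighborhood-search component of such a hybrid is typically organized (a generic destroy-and-repair loop, not the authors' algorithm, with a made-up toy objective standing in for the MIP/CP subproblems), consider:

```python
import random

def large_neighborhood_search(initial, objective, destroy, repair, iterations=200, seed=0):
    """Generic LNS loop: repeatedly relax part of the incumbent solution and re-optimize it."""
    rng = random.Random(seed)
    best = incumbent = initial
    for _ in range(iterations):
        partial = destroy(incumbent, rng)        # free a subset of the decisions
        candidate = repair(partial, rng)         # re-optimize them (a MIP/CP call in the hybrid)
        if objective(candidate) <= objective(incumbent):
            incumbent = candidate
        if objective(incumbent) < objective(best):
            best = incumbent
    return best

# Toy instance: assign each delivery to one of three depots, minimizing a made-up cost.
deliveries = list(range(10))
costs = [[random.Random(i * 3 + d).uniform(1, 10) for d in range(3)] for i in deliveries]

objective = lambda assign: sum(costs[i][assign[i]] for i in deliveries)
destroy = lambda assign, rng: {i: (None if rng.random() < 0.3 else a) for i, a in assign.items()}
repair = lambda partial, rng: {i: (min(range(3), key=lambda d: costs[i][d]) if a is None else a)
                               for i, a in partial.items()}

start = {i: 0 for i in deliveries}
best = large_neighborhood_search(start, objective, destroy, repair)
print("cost:", round(objective(best), 2))
```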
The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)
NASA Technical Reports Server (NTRS)
1993-01-01
The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists' which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.
Birnbaum, Shira; Sperber-Weiss, Doreen; Dimitrios, Timothy; Eckel, Donald; Monroy-Miller, Cherry; Monroe, Janet J; Friedman, Ross; Ologbosele, Mathias; Epo, Grace; Sharpe, Debra; Zarski, Yongsuk
A large state psychiatric hospital experienced a state-mandated Reduction in Force that resulted in the abrupt loss and rapid turnover of more than 40% of its nursing and paraprofessional staff. The change exemplified current national trends toward downsizing and facility closure. This article describes revisions to the nursing orientation program that supported cost containment and fidelity to mission and clinical practices during the transition. An existing nursing orientation program was reconfigured in alignment with principles of rational instructional design and a core-competencies model of curriculum development, evidence-based practices that provided tactical clarity and commonality of purpose during a complex and emotionally charged transition period. Program redesign enabled efficiencies that facilitated the transition, with no evidence of associated negative effects. The process described here offers an example for hospitals facing similar workforce reorganization in an era of public sector downsizing.
Evaluating and extending user-level fault tolerance in MPI applications
Laguna, Ignacio; Richards, David F.; Gamblin, Todd; ...
2016-01-11
The user-level failure mitigation (ULFM) interface has been proposed to provide fault-tolerant semantics in the Message Passing Interface (MPI). Previous work presented performance evaluations of ULFM; yet questions related to its programmability and applicability, especially to non-trivial, bulk synchronous applications, remain unanswered. In this article, we present our experiences on using ULFM in a case study with a large, highly scalable, bulk synchronous molecular dynamics application to shed light on the advantages and difficulties of this interface to program fault-tolerant MPI applications. We found that, although ULFM is suitable for master–worker applications, it provides few benefits for more common bulk synchronous MPI applications. Furthermore, to address these limitations, we introduce a new, simpler fault-tolerant interface for complex, bulk synchronous MPI programs with better applicability and support than ULFM for application-level recovery mechanisms, such as global rollback.
NASA Technical Reports Server (NTRS)
Smith, Mark S.; Bui, Trong T.; Garcia, Christian A.; Cumming, Stephen B.
2016-01-01
A pair of compliant trailing edge flaps was flown on a modified GIII airplane. Prior to flight test, multiple analysis tools of various levels of complexity were used to predict the aerodynamic effects of the flaps. Vortex lattice, full potential flow, and full Navier-Stokes aerodynamic analysis software programs were used for prediction, in addition to another program that used empirical data. After the flight-test series, lift and pitching moment coefficient increments due to the flaps were estimated from flight data and compared to the results of the predictive tools. The predicted lift increments matched flight data well for all predictive tools for small flap deflections. All tools over-predicted lift increments for large flap deflections. The potential flow and Navier-Stokes programs predicted pitching moment coefficient increments better than the other tools.
Monte Carlo simulation of biomolecular systems with BIOMCSIM
NASA Astrophysics Data System (ADS)
Kamberaj, H.; Helms, V.
2001-12-01
A new Monte Carlo simulation program, BIOMCSIM, is presented that has been developed in particular to simulate the behaviour of biomolecular systems, leading to insights and understanding of their functions. The computational complexity in Monte Carlo simulations of high density systems, with large molecules like proteins immersed in a solvent medium, or when simulating the dynamics of water molecules in a protein cavity, is enormous. The program presented in this paper seeks to provide these desirable features putting special emphasis on simulations in grand canonical ensembles. It uses different biasing techniques to increase the convergence of simulations, and periodic load balancing in its parallel version, to maximally utilize the available computer power. In periodic systems, the long-ranged electrostatic interactions can be treated by Ewald summation. The program is modularly organized, and implemented using an ANSI C dialect, so as to enhance its modifiability. Its performance is demonstrated in benchmark applications for the proteins BPTI and Cytochrome c Oxidase.
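For the grand canonical part, the standard Metropolis acceptance rules for particle insertion and deletion can be sketched as follows; this is the textbook form (e.g. Frenkel and Smit) and not necessarily the exact expression coded in BIOMCSIM.

```python
import math
import random

def accept_insertion(dU, N, V, mu, beta, Lambda):
    """Acceptance probability for inserting one particle in a grand canonical MC move."""
    return min(1.0, V / (Lambda**3 * (N + 1)) * math.exp(beta * mu - beta * dU))

def accept_deletion(dU, N, V, mu, beta, Lambda):
    """Acceptance probability for deleting one particle (dU = U_new - U_old)."""
    return min(1.0, Lambda**3 * N / V * math.exp(-beta * mu - beta * dU))

def metropolis(acc_probability, rng=random):
    """Standard Metropolis decision: accept the trial move with the given probability."""
    return rng.random() < acc_probability

# Example numbers (arbitrary reduced units): a favourable insertion is usually accepted.
p = accept_insertion(dU=-1.0, N=100, V=1000.0, mu=-2.0, beta=1.0, Lambda=1.0)
print(round(p, 3), metropolis(p))
```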
A boundedness result for the direct heuristic dynamic programming.
Liu, Feng; Sun, Jian; Si, Jennie; Guo, Wentao; Mei, Shengwei
2012-08-01
Approximate/adaptive dynamic programming (ADP) has been studied extensively in recent years for its potential scalability to solve large state and control space problems, including those involving continuous states and continuous controls. The applicability of ADP algorithms, especially the adaptive critic designs, has been demonstrated in several case studies. Direct heuristic dynamic programming (direct HDP) is one of the ADP algorithms inspired by the adaptive critic designs. It has been shown applicable to industrial-scale, realistic and complex control problems. In this paper, we provide a uniform ultimate boundedness (UUB) result for the direct HDP learning controller under mild and intuitive conditions. By using a Lyapunov approach we show that the estimation errors of the learning parameters, or the weights in the action and critic networks, remain UUB. This result provides a useful controller convergence guarantee for the first time for the direct HDP design. Copyright © 2012 Elsevier Ltd. All rights reserved.
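For reference, the UUB property being established can be stated in its standard discrete-time form (a paraphrase of the textbook definition, not the paper's exact statement): the estimation errors enter and remain inside a fixed bound after a finite number of steps.

```latex
% Standard discrete-time UUB definition (not quoted from the paper):
% a signal e(k) is uniformly ultimately bounded with ultimate bound b if, for every
% initial condition with \|e(k_0)\| \le a, there exists T(a,b), independent of k_0, such that
\[
  \|e(k_0)\| \le a \;\Longrightarrow\; \|e(k)\| \le b \quad \forall\, k \ge k_0 + T(a,b).
\]
```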
National Immunization Program: Computerized System as a tool for new challenges
Sato, Ana Paula Sayuri
2015-01-01
The scope and coverage of the Brazilian Immunization Program can be compared with those in developed countries because it provides a large number of vaccines and has a considerable coverage. The increasing complexity of the program brings challenges regarding its development, high coverage levels, access equality, and safety. The Immunization Information System, with nominal data, is an innovative tool that can more accurately monitor these indicators and allows the evaluation of the impact of new vaccination strategies. The main difficulties for such a system are in its implementation process, training of professionals, mastering its use, its constant maintenance needs and ensuring the information contained remain confidential. Therefore, encouraging the development of this tool should be part of public health policies and should also be involved in the three spheres of government as well as the public and private vaccination services. PMID:26176746
Comprehensive rotorcraft analysis methods
NASA Technical Reports Server (NTRS)
Stephens, Wendell B.; Austin, Edward E.
1988-01-01
The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).
Self-paced exercise program for office workers: impact on productivity and health outcomes.
Low, David; Gramlich, Martha; Engram, Barbara Wright
2007-03-01
The impact of a self-paced exercise program on productivity and health outcomes of 32 adult workers in a large federal office complex was investigated during 3 months. Walking was the sole form of exercise. The first month, during which no walking occurred, was the control period. The second and third months were the experimental period. Participants were divided into three levels based on initial weight and self-determined walking distance goals. Productivity (using the Endicott Work Productivity Scale), walking distance (using a pedometer), and health outcomes (blood pressure, weight, pulse rate, and body fat percentage) were measured weekly. Results from this study, based on a paired t test analysis, suggest that although the self-paced exercise program had no impact on productivity, it lowered blood pressure and promoted weight loss. Further study using a larger sample and a controlled experimental design is recommended to provide conclusive evidence.
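The analysis is a paired comparison of each participant's measurements before and during the walking period; a minimal illustration with SciPy, using invented blood pressure numbers rather than the study's data, is:

```python
import numpy as np
from scipy import stats

# Invented systolic blood pressure readings (mmHg) for the same 8 workers,
# measured in the control month and again at the end of the walking months.
control_month = np.array([128, 135, 142, 120, 150, 138, 131, 145])
walking_month = np.array([124, 130, 139, 121, 143, 132, 129, 140])

# Paired t test: each worker serves as their own control.
t_stat, p_value = stats.ttest_rel(control_month, walking_month)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```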
Space Environments and Effects (SEE) Program: Spacecraft Charging Technology Development Activities
NASA Technical Reports Server (NTRS)
Kauffman, Billy; Hardage, Donna; Minor, Jody
2003-01-01
Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how materials and spacecraft systems will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the continued push to use Commercial-off-the-shelf (COTS) microelectronics, potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will introduce the SEE Program, briefly discuss past and currently sponsored spacecraft charging activities and possible future endeavors.
Space Environments and Effects (SEE) Program: Spacecraft Charging Technology Development Activities
NASA Technical Reports Server (NTRS)
Kauffman, B.; Hardage, D.; Minor, J.
2004-01-01
Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how materials and spacecraft systems will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the continued push to use Commercial-off-the-Shelf (COTS) microelectronics, potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will introduce the SEE Program, briefly discuss past and currently sponsored spacecraft charging activities and possible future endeavors.
Non-additive simple potentials for pre-programmed self-assembly
NASA Astrophysics Data System (ADS)
Mendoza, Carlos
2015-03-01
A major goal in nanoscience and nanotechnology is the self-assembly of any desired complex structure with a system of particles interacting through simple potentials. To achieve this objective, intense experimental and theoretical efforts are currently concentrated in the development of so-called "patchy" particles. Here we follow a completely different approach and introduce a very accessible model to produce a large variety of pre-programmed two-dimensional (2D) complex structures. Our model consists of a binary mixture of particles interacting through isotropic potentials that is able to self-assemble into targeted lattices by the appropriate choice of a small number of geometrical parameters and interaction strengths. We study the system using Monte Carlo computer simulations and, despite its simplicity, we are able to self-assemble potentially useful structures such as chains, stripes, Kagomé, twisted Kagomé, honeycomb, square, Archimedean and quasicrystalline tilings. Our model is designed such that it may be implemented using discotic particles or, alternatively, using exclusively spherical particles interacting isotropically. Thus, it represents a promising strategy for bottom-up nano-fabrication. Partial Financial Support: DGAPA IN-110613.
A Monte Carlo model for the gardening of the lunar regolith
NASA Technical Reports Server (NTRS)
Arnold, J. R.
1975-01-01
The processes of movement and turnover of the lunar regolith are described by a Monte Carlo model. The movement of material by the direct cratering process is the dominant mode, but slumping is also included for angles exceeding the static angle of repose. Using a group of interrelated computer programs, a large number of properties are calculated, including topography, formation of layers, depth of the disturbed layer, nuclear-track distributions, and cosmogenic nuclides. In the most complex program, the history of a 36-point square array is followed for times up to 400 million years. The histories generated are complex and exhibit great variety. Because a crater covers much less area than its ejecta blanket, there is a tendency for the height change at a test point to exhibit periods of slow accumulation followed by sudden excavation. In general, the agreement with experiment and observation seems good, but two areas of disagreement stand out. First, the calculated surface is rougher than that observed. Second, the observed bombardment ages, of the order of 400 million years, are shorter than expected (by perhaps a factor of 5).
The force on the flex: Global parallelism and portability
NASA Technical Reports Server (NTRS)
Jordan, H. F.
1986-01-01
A parallel programming methodology, called the force, supports the construction of programs to be executed in parallel by an unspecified, but potentially large, number of processes. The methodology was originally developed on a pipelined, shared memory multiprocessor, the Denelcor HEP, and embodies the primitive operations of the force in a set of macros which expand into multiprocessor Fortran code. A small set of primitives is sufficient to write large parallel programs, and the system has been used to produce 10,000-line programs in computational fluid dynamics. The level of complexity of the force primitives is intermediate. It is high enough to mask detailed architectural differences between multiprocessors but low enough to give the user control over performance. The system is being ported to a medium scale multiprocessor, the Flex/32, which is a 20 processor system with a mixture of shared and local memory. Memory organization and the type of processor synchronization supported by the hardware on the two machines lead to some differences in efficient implementations of the force primitives, but the user interface remains the same. An initial implementation was done by retargeting the macros to Flexible Computer Corporation's ConCurrent C language. Subsequently, the macros were modified to produce directly the system calls which form the basis for ConCurrent C. The implementation of the Fortran-based system is in step with Flexible Computer Corporation's implementation of a Fortran system in the parallel environment.
Targetting and guidance program documentation. [a user's manual
NASA Technical Reports Server (NTRS)
Harrold, E. F.; Neyhard, J. F.
1974-01-01
A FORTRAN computer program was developed which automatically targets two and three burn rendezvous missions and performs feedback guidance using the GUIDE algorithm. The program was designed to accept a large class of orbit specifications and to automatically choose a two or three burn mission depending upon the time alignment of the vehicle and target. The orbits may be specified as any combination of circular and elliptical orbits and may be coplanar or inclined, but must be aligned coaxially with their perigees in the same direction. The program accomplishes the required targeting by repeatedly converging successively more complex missions. It solves the coplanar impulsive version of the mission, then the finite burn coplanar mission, and finally, the full plane change mission. The GUIDE algorithm is exercised in a feedback guidance mode by taking the targeted solution and moving the vehicle state step by step ahead in time, adding acceleration and navigational errors, and reconverging from the perturbed states at fixed guidance update intervals. A program overview is presented, along with a user's guide which details input, output, and the various subroutines.
Integrated Digital Flight Control System for the Space Shuttle Orbiter
NASA Technical Reports Server (NTRS)
1973-01-01
The objective of the integrated digital flight control system (DFCS) is to provide rotational and translational control of the space shuttle orbiter in all phases of flight: from launch ascent through orbit to entry and touchdown, and during powered horizontal flights. The program provides a versatile control system structure while maintaining uniform communications with other programs, sensors, and control effectors by using an executive routine/functional subroutine format. The program reads all external variables at a single point, copies them into its dedicated storage, and then calls the required subroutines in the proper sequence. As a result, the flight control program is largely independent of other programs in the computer complex and is equally insensitive to characteristics of the processor configuration. The integrated structure of the control system and the DFCS executive routine that embodies it are described. The input and output, including jet selection, are included. Specific estimation and control algorithms are shown for the various mission phases: cruise (including horizontal powered flight), entry, on-orbit, and boost. Attitude maneuver routines that interface with the DFCS are included.
Astrochemical evolution along star formation: Overview of the IRAM Large Program ASAI
NASA Astrophysics Data System (ADS)
Lefloch, Bertrand; Bachiller, R.; Ceccarelli, C.; Cernicharo, J.; Codella, C.; Fuente, A.; Kahane, C.; López-Sepulcre, A.; Tafalla, M.; Vastel, C.; Caux, E.; González-García, M.; Bianchi, E.; Gómez-Ruiz, A.; Holdship, J.; Mendoza, E.; Ospina-Zamudio, J.; Podio, L.; Quénard, D.; Roueff, E.; Sakai, N.; Viti, S.; Yamamoto, S.; Yoshida, K.; Favre, C.; Monfredini, T.; Quitián-Lara, H. M.; Marcelino, N.; Roberty, H. Boechat; Cabrit, S.
2018-04-01
Evidence is mounting that the small bodies of our Solar System, such as comets and asteroids, have at least partially inherited their chemical composition from the first phases of the Solar System formation. It then appears that the molecular complexity of these small bodies is most likely related to the earliest stages of star formation. It is therefore important to characterize and to understand how the chemical evolution changes with solar-type protostellar evolution. We present here the Large Program "Astrochemical Surveys At IRAM" (ASAI). Its goal is to carry out unbiased millimeter line surveys between 80 and 272 GHz of a sample of ten template sources, which fully cover the first stages of the formation process of solar-type stars, from prestellar cores to the late protostellar phase. In this article, we present an overview of the surveys and results obtained from the analysis of the 3 mm band observations. The number of detected main isotopic species barely varies with the evolutionary stage and is found to be very similar to that of massive star-forming regions. The molecular content in O- and C- bearing species allows us to define two chemical classes of envelopes, whose composition is dominated by either a) a rich content in O-rich complex organic molecules, associated with hot corino sources, or b) a rich content in hydrocarbons, typical of Warm Carbon Chain Chemistry sources. Overall, a high chemical richness is found to be present already in the initial phases of solar-type star formation.
NASA Astrophysics Data System (ADS)
Gusev, Anatoly; Diansky, Nikolay; Zalesny, Vladimir
2010-05-01
An original program complex for an ocean circulation sigma-model, developed at the Institute of Numerical Mathematics (INM), Russian Academy of Sciences (RAS), is presented. The complex can be used in various curvilinear orthogonal coordinate systems. In addition to the ocean circulation model, the complex contains a sea ice dynamics and thermodynamics model, as well as an original system for implementing atmospheric forcing on the basis of both prescribed meteorological data and atmospheric model results. The complex can serve as the oceanic block of an Earth climate model and can also be used to solve scientific and practical problems concerning the World Ocean and its individual oceans and seas. The developed program complex runs efficiently on parallel shared-memory computational systems and on contemporary personal computers. On the basis of the proposed complex, an ocean general circulation model (OGCM) was developed. The model is formulated in a curvilinear orthogonal coordinate system obtained by a conformal transformation of the standard geographical grid, which allows the singularities of the system to be placed outside the integration domain. The horizontal resolution of the OGCM is 1 degree in longitude and 0.5 degree in latitude, with 40 non-uniform sigma-levels in depth. The model was integrated for 100 years, starting from the Levitus January climatology, using a realistic atmospheric annual cycle calculated from the CORE datasets. The experimental results show that the model adequately reproduces the basic characteristics of large-scale World Ocean dynamics, in good agreement with both observational data and the results of the best climatic OGCMs. This OGCM is used as the oceanic component of the new version of the climate system model (CSM) developed at INM RAS. The latter is now ready for new numerical experiments on modelling climate and its change under IPCC (Intergovernmental Panel on Climate Change) scenarios within the scope of CMIP-5 (Coupled Model Intercomparison Project). On the basis of the proposed complex, an eddy-resolving Pacific Ocean circulation model was also realized. The integration domain covers the Pacific from the Equator to the Bering Strait. The model's horizontal resolution is 0.125 degree, with 20 non-uniform sigma-levels in depth. The model adequately reproduces the large-scale structure of the circulation and its variability: Kuroshio meandering, oceanic synoptic eddies, frontal zones, etc. The high variability of the Kuroshio is demonstrated. The distribution of a contaminant assumed to be discharged near Petropavlovsk-Kamchatsky was simulated. The results reveal the structure of the contaminant distribution and provide insight into the processes that form the hydrological fields of the North-West Pacific.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J; Johnson, Seth R; Prokopenko, Andrey V
'ForTrilinos' is related to The Trilinos Project, which contains a large and growing collection of solver capabilities that can utilize next-generation platforms, in particular scalable multicore, manycore, accelerator and heterogeneous systems. Trilinos is primarily written in C++, including its user interfaces. While C++ is advantageous for gaining access to the latest programming environments, it limits Trilinos usage via Fortran. Several ad hoc translation interfaces exist to enable Fortran usage of Trilinos, but none of these interfaces is general-purpose or written for reusable and sustainable external use. 'ForTrilinos' provides a seamless pathway for large and complex Fortran-based codes to access Trilinos without C/C++ interface code. This access includes Fortran versions of Kokkos abstractions for code execution and data management.
ReOpt[trademark] V2.0 user guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M K; Bryant, J L
1992-10-01
Cleaning up the large number of contaminated waste sites at Department of Energy (DOE) facilities in the US presents a large and complex problem. Each waste site poses a singular set of circumstances (different contaminants, environmental concerns, and regulations) that affect selection of an appropriate response. Pacific Northwest Laboratory (PNL) developed ReOpt to provide information about the remedial action technologies that are currently available. It is an easy-to-use personal computer program and database that contains data about these remedial technologies and auxiliary data about contaminants and regulations. ReOpt will enable engineers and planners involved in environmental restoration efforts to quickly identify potentially applicable environmental restoration technologies and access corresponding information required to select cleanup activities for DOE sites.
A tropical horde of counterfeit predator eyes.
Janzen, Daniel H; Hallwachs, Winnie; Burns, John M
2010-06-29
We propose that the many different, but essentially similar, eye-like and face-like color patterns displayed by hundreds of species of tropical caterpillars and pupae, 26 examples of which are displayed here from the dry, cloud, and rain forests of Area de Conservacion Guanacaste (ACG) in northwestern Costa Rica, constitute a huge and pervasive mimicry complex that is evolutionarily generated and sustained by the survival behavior of a large and multispecific array of potential predators: the insect-eating birds. We propose that these predators are variously and innately programmed to flee when abruptly confronted, at close range, with what appears to be an eye of one of their predators. Such a mimetic complex differs from various classical Batesian and Müllerian mimicry complexes of adult butterflies in that (i) the predators sustain it for the most part by innate traits rather than by avoidance behavior learned through disagreeable experiences, (ii) the more or less harmless, sessile, and largely edible mimics vastly outnumber the models, and (iii) there is no particular selection for the eye-like color pattern to closely mimic the eye or face of any particular predator of the insect-eating birds or that of any other member of this mimicry complex. Indeed, selection may not favor exact resemblance among these mimics at all. Such convergence through selection could create a superabundance of one particular false eyespot or face pattern, thereby increasing the likelihood of a bird species or guild learning to associate that pattern with harmless prey.
Russo, Philip L; Havers, Sally M; Cheng, Allen C; Richards, Michael; Graves, Nicholas; Hall, Lisa
2016-12-01
There are many well-established national health care-associated infection surveillance programs (HAISPs). Although validation studies have described data quality, there is little research describing important characteristics of large HAISPs. The aim of this study was to broaden our understanding and identify key characteristics of large HAISPs. Semi-structured interviews were conducted with purposively selected leaders from national and state-based HAISPs. Interview data were analyzed following an interpretive description process. Seven semi-structured interviews were conducted over a 6-month period during 2014-2015. Analysis of the data generated 5 distinct characteristics of large HAISPs: (1) triggers: surveillance was initiated by government or a cooperative of like-minded people, (2) purpose: a clear purpose is needed and determines other surveillance mechanisms, (3) data measures: consistency is more important than accuracy, (4) processes: a balance exists between the volume of data collected and resources, and (5) implementation and maintenance: a central coordinating body is crucial for uniformity and support. National HAISPs are complex and affect a broad range of stakeholders. Although the overall goal of health care-associated infection surveillance is to reduce the incidence of health care-associated infection, there are many crucial factors to be considered in attaining this goal. The findings from this study will assist the development of new HAISPs and could be used as an adjunct to evaluate existing programs. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Molecular Marker Systems for Oenothera Genetics
Rauwolf, Uwe; Golczyk, Hieronim; Meurer, Jörg; Herrmann, Reinhold G.; Greiner, Stephan
2008-01-01
The genus Oenothera has an outstanding scientific tradition. It has been a model for studying aspects of chromosome evolution and speciation, including the impact of plastid nuclear co-evolution. A large collection of strains analyzed during a century of experimental work and unique genetic possibilities allow the exchange of genetically definable plastids, individual or multiple chromosomes, and/or entire haploid genomes (Renner complexes) between species. However, molecular genetic approaches for the genus are largely lacking. In this study, we describe the development of efficient PCR-based marker systems for both the nuclear genome and the plastome. They allow distinguishing individual chromosomes, Renner complexes, plastomes, and subplastomes. We demonstrate their application by monitoring interspecific exchanges of genomes, chromosome pairs, and/or plastids during crossing programs, e.g., to produce plastome–genome incompatible hybrids. Using an appropriate partial permanent translocation heterozygous hybrid, linkage group 7 of the molecular map could be assigned to chromosome 9·8 of the classical Oenothera map. Finally, we provide the first direct molecular evidence that homologous recombination and free segregation of chromosomes in permanent translocation heterozygous strains is suppressed. PMID:18791241
Molecular marker systems for Oenothera genetics.
Rauwolf, Uwe; Golczyk, Hieronim; Meurer, Jörg; Herrmann, Reinhold G; Greiner, Stephan
2008-11-01
The genus Oenothera has an outstanding scientific tradition. It has been a model for studying aspects of chromosome evolution and speciation, including the impact of plastid nuclear co-evolution. A large collection of strains analyzed during a century of experimental work and unique genetic possibilities allow the exchange of genetically definable plastids, individual or multiple chromosomes, and/or entire haploid genomes (Renner complexes) between species. However, molecular genetic approaches for the genus are largely lacking. In this study, we describe the development of efficient PCR-based marker systems for both the nuclear genome and the plastome. They allow distinguishing individual chromosomes, Renner complexes, plastomes, and subplastomes. We demonstrate their application by monitoring interspecific exchanges of genomes, chromosome pairs, and/or plastids during crossing programs, e.g., to produce plastome-genome incompatible hybrids. Using an appropriate partial permanent translocation heterozygous hybrid, linkage group 7 of the molecular map could be assigned to chromosome 9.8 of the classical Oenothera map. Finally, we provide the first direct molecular evidence that homologous recombination and free segregation of chromosomes in permanent translocation heterozygous strains is suppressed.
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Bradley, Richard F.; Brisken, Walter F.; Cotton, William D.; Emerson, Darrel T.; Kerr, Anthony R.; Lacasse, Richard J.; Morgan, Matthew A.; Napier, Peter J.; Norrod, Roger D.; Payne, John M.; Pospieszalski, Marian W.; Symmes, Arthur; Thompson, A. Richard; Webber, John C.
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
Multiplexed Predictive Control of a Large Commercial Turbofan Engine
NASA Technical Reports Server (NTRS)
Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.
2008-01-01
Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
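The sequential, cyclic actuator update is the part of the multiplexed scheme that can be illustrated without the engine model itself. Below is a minimal sketch, assuming a hypothetical linear gain matrix G, actuator limits u_min/u_max, and a simple quadratic tracking cost; it is not the paper's quadratic-programming formulation or the commercial engine model, only a toy demonstration of re-optimizing one actuator per control update in round-robin order.

```python
# Minimal sketch of the multiplexed idea: at each control step only ONE actuator's
# command is re-optimized while the others are held at their previous values, and
# the updated actuator cycles round-robin. Plant, cost, and bounds are toy values.
import numpy as np

rng = np.random.default_rng(0)
n_u, n_y = 3, 2                      # actuators, tracked outputs (toy sizes)
G = rng.normal(size=(n_y, n_u))      # hypothetical steady-state gain matrix
u = np.zeros(n_u)                    # current actuator commands
u_min, u_max = -1.0, 1.0             # box constraints on each actuator
y_ref = np.array([0.7, -0.3])        # output setpoint

def cost(u_vec):
    """Quadratic tracking cost ||G u - y_ref||^2 plus a small control penalty."""
    e = G @ u_vec - y_ref
    return float(e @ e + 1e-3 * u_vec @ u_vec)

for step in range(12):
    j = step % n_u                        # multiplexing: next actuator, cyclically
    g_j = G[:, j]
    resid = y_ref - (G @ u - g_j * u[j])  # output error excluding actuator j
    # Closed-form minimizer of the one-variable quadratic, then clip to limits.
    u_star = (g_j @ resid) / (g_j @ g_j + 1e-3)
    u[j] = np.clip(u_star, u_min, u_max)
    print(f"step {step}: updated u[{j}] -> {u[j]:+.3f}, cost = {cost(u):.4f}")
```

Because each update solves only a one-variable problem, the per-update computation is a small fraction of a full simultaneous optimization, which is the saving the multiplexed formulation targets.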
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
Large-deformation modal coordinates for nonrigid vehicle dynamics
NASA Technical Reports Server (NTRS)
Likins, P. W.; Fleischer, G. E.
1972-01-01
The derivation of minimum-dimension sets of discrete-coordinate and hybrid-coordinate equations of motion of a system consisting of an arbitrary number of hinge-connected rigid bodies assembled in tree topology is presented. These equations are useful for the simulation of dynamical systems that can be idealized as tree-like arrangements of substructures, with each substructure consisting of either a rigid body or a collection of elastically interconnected rigid bodies restricted to small relative rotations at each connection. Thus, some of the substructures represent elastic bodies subjected to small strains or local deformations, but possibly large gross deformations. In the hybrid formulation, distributed coordinates, referred to herein as large-deformation modal coordinates, are used for the deformations of these substructures. The equations are in a form suitable for incorporation into one or more computer programs to be used as multipurpose tools in the simulation of spacecraft and other complex electromechanical systems.
Stochastic dynamics of genetic broadcasting networks
NASA Astrophysics Data System (ADS)
Potoyan, Davit; Wolynes, Peter
The complex genetic programs of eukaryotic cells are often regulated by key transcription factors occupying or clearing out of a large number of genomic locations. Orchestrating the residence times of these factors is therefore important for the well organized functioning of a large network. The classic models of genetic switches sidestep this timing issue by assuming the binding of transcription factors to be governed entirely by thermodynamic protein-DNA affinities. Here we show that relying on passive thermodynamics and random release times can lead to a "time-scale crisis" of master genes that broadcast their signals to a large number of binding sites. We demonstrate that this "time-scale crisis" can be resolved by actively regulating residence times through molecular stripping. We illustrate these ideas by studying the stochastic dynamics of the genetic network of the central eukaryotic master regulator NFκB, which broadcasts its signals to many downstream genes that regulate immune response, apoptosis, etc.
Stochastic dynamics of genetic broadcasting networks
NASA Astrophysics Data System (ADS)
Potoyan, Davit A.; Wolynes, Peter G.
2017-11-01
The complex genetic programs of eukaryotic cells are often regulated by key transcription factors occupying or clearing out of a large number of genomic locations. Orchestrating the residence times of these factors is therefore important for the well organized functioning of a large network. The classic models of genetic switches sidestep this timing issue by assuming the binding of transcription factors to be governed entirely by thermodynamic protein-DNA affinities. Here we show that relying on passive thermodynamics and random release times can lead to a "time-scale crisis" for master genes that broadcast their signals to a large number of binding sites. We demonstrate that this time-scale crisis for clearance in a large broadcasting network can be resolved by actively regulating residence times through molecular stripping. We illustrate these ideas by studying a model of the stochastic dynamics of the genetic network of the central eukaryotic master regulator NFκB, which broadcasts its signals to many downstream genes that regulate immune response, apoptosis, etc.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-10
...] Medicare Program; Section 3113: The Treatment of Certain Complex Diagnostic Laboratory Tests Demonstration... code under the Treatment of Certain Complex Diagnostic Laboratory Tests Demonstration. The deadline for... interested parties of an opportunity to participate in the Treatment of Certain Complex Diagnostic Laboratory...
How Does The Universe Work? The Physics Of The Cosmos Program (PCOS)
NASA Astrophysics Data System (ADS)
Sambruna, Rita M.
2011-09-01
The Physics of the Cosmos (PCOS) program incorporates cosmology, high-energy astrophysics, and fundamental physics projects aimed at addressing central questions about the nature of complex astrophysical phenomena such as black holes, neutron stars, dark energy, and gravitational waves. Its overarching theme is, How does the Universe work? PCOS includes a suite of operating (Chandra, Fermi, Planck, XMM-Newton, INTEGRAL) and future missions across the electromagnetic spectrum and beyond, which are in concept development and/or formulation. The PCOS program directly supports development of intermediate TRL (4-6) technology relevant to future missions through the Strategic Astrophysics Technology (SAT) program, as well as data analysis, theory, and experimental astrophysics via other R&A avenues (e.g., ADAP, ATP). The Einstein Fellowship is a vital and vibrant PCOS component funded by the program. PCOS receives community input via its Program Analysis Group, the PhysPAG (www.pcos.gsfc.nasa.gov/physpag.php), whose membership and meetings are open to the community at large. In this poster, we describe the detailed science questions addressed within PCOS, with special emphasis on future opportunities. Details about the PhysPAG operations and functions will be provided, as well as an update on future meetings.
Array data extractor (ADE): a LabVIEW program to extract and merge gene array data
2013-01-01
Background: Large data sets from gene expression array studies are publicly available, offering information highly valuable for research across many disciplines ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies persists. Findings: Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed known regulation of a beta 1 adrenergic receptor and further indicate novel research targets. Conclusions: Although existing software allows for complex data analyses, the LabVIEW-based program presented here, "Array Data Extractor (ADE)", provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows applying changes to the program without the need of advanced programming knowledge. PMID:24289243
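ADE itself is written in LabVIEW, so the following is only a hedged sketch, in Python/pandas, of the extract-and-merge idea the abstract describes: pull the rows for a list of genes out of several normalized datasets and combine them into one table. The file names, column layout, and gene identifiers are invented for illustration.

```python
# Hedged sketch of the extract-and-merge idea behind a tool like ADE, written in
# Python/pandas rather than LabVIEW. File names, column names, and the gene list
# are hypothetical; real studies would need matching probe/gene identifiers.
import pandas as pd

gene_list = ["ADRB1", "ADRB2", "NPY1R"]          # genes of interest (example IDs)
study_files = ["study1_normalized.csv",          # assumed layout: rows = genes,
               "study2_normalized.csv"]          # columns = samples

extracted = []
for path in study_files:
    df = pd.read_csv(path, index_col=0)          # gene identifiers as the index
    subset = df.reindex(gene_list)               # keep only the requested genes
    subset.columns = [f"{path}:{c}" for c in subset.columns]  # tag sample origin
    extracted.append(subset)

# Merge the per-study slices side by side into one table for downstream comparison.
merged = pd.concat(extracted, axis=1)
merged.to_csv("merged_expression.csv")
print(merged.head())
```

The merge only works if the studies share a common gene or probe identifier, which is the same prerequisite a tool like ADE faces.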
From Petascale to Exascale: Eight Focus Areas of R&D Challenges for HPC Simulation Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R; Still, C; Schulz, M
2011-03-17
Programming models bridge the gap between the underlying hardware architecture and the supporting layers of software available to applications. Programming models are different from both programming languages and application programming interfaces (APIs). Specifically, a programming model is an abstraction of the underlying computer system that allows for the expression of both algorithms and data structures. In comparison, languages and APIs provide implementations of these abstractions and allow the algorithms and data structures to be put into practice - a programming model exists independently of the choice of both the programming language and the supporting APIs. Programming models are typically focused on achieving increased developer productivity, performance, and portability to other system designs. The rapidly changing nature of processor architectures and the complexity of designing an exascale platform provide significant challenges for these goals. Several other factors are likely to impact the design of future programming models. In particular, the representation and management of increasing levels of parallelism, concurrency and memory hierarchies, combined with the ability to maintain a progressive level of interoperability with today's applications are of significant concern. Overall the design of a programming model is inherently tied not only to the underlying hardware architecture, but also to the requirements of applications and libraries including data analysis, visualization, and uncertainty quantification. Furthermore, the successful implementation of a programming model is dependent on exposed features of the runtime software layers and features of the operating system. Successful use of a programming model also requires effective presentation to the software developer within the context of traditional and new software development tools. Consideration must also be given to the impact of programming models on both languages and the associated compiler infrastructure. Exascale programming models must reflect several, often competing, design goals. These design goals include desirable features such as abstraction and separation of concerns. However, some aspects are unique to large-scale computing. For example, interoperability and composability with existing implementations will prove critical. In particular, performance is the essential underlying goal for large-scale systems. A key evaluation metric for exascale models will be the extent to which they support these goals rather than merely enable them.
Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.
Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco
2013-01-01
Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.
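As a concrete illustration of the kind of computation involved, the following is a minimal sketch, in plain numpy, of one contrastive-divergence (CD-1) update for a single restricted Boltzmann machine, the building block of a deep belief network. The sizes and data are toy values and this is not the authors' code; the point is that the whole update is a handful of dense matrix operations, which is exactly what maps well onto a graphics card (for example by substituting a numpy-compatible GPU array library for numpy).

```python
# Minimal numpy sketch of one contrastive-divergence (CD-1) update for a single
# restricted Boltzmann machine. Illustrative only: random data, toy sizes.
import numpy as np

rng = np.random.default_rng(1)
n_visible, n_hidden, batch, lr = 784, 256, 64, 0.05

W = 0.01 * rng.normal(size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v0 = (rng.random((batch, n_visible)) < 0.1).astype(float)   # fake binary data

# Positive phase: hidden activations driven by the data.
h0_prob = sigmoid(v0 @ W + b_h)
h0_sample = (rng.random(h0_prob.shape) < h0_prob).astype(float)

# Negative phase: one reconstruction step (CD-1).
v1_prob = sigmoid(h0_sample @ W.T + b_v)
h1_prob = sigmoid(v1_prob @ W + b_h)

# Gradient approximation and parameter update.
W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
b_v += lr * (v0 - v1_prob).mean(axis=0)
b_h += lr * (h0_prob - h1_prob).mean(axis=0)

print("reconstruction error:", float(((v0 - v1_prob) ** 2).mean()))
```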
Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists
Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco
2013-01-01
Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617
Data Structures for Extreme Scale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahan, Simon
As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we have demonstrated that some of these computations, once thought to require expensive hardware designs and/or complex, special-purpose programming, may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose "latency-tolerant" programming framework. One important application of the ideas underlying this framework is graph database technology supporting social network pattern matching, used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out by the Pacific Northwest National Laboratory, a Department of Energy Laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected ahead of the expense of manufacture.
Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.
2013-01-01
A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
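The paper defines its own language for combining unit operations into a microfluidic program; the snippet below is a purely hypothetical sketch, in Python, of what expressing a protocol as an ordered list of mix/transfer/rinse operations can look like. The operation names, fields, and the serial-dilution example are invented for illustration and are not the language described in the paper.

```python
# Purely hypothetical sketch of a sample-processing protocol expressed as a list of
# unit operations, in the spirit of combining mixing/transfer/rinse steps into a
# "microfluidic program". Names and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Op:
    name: str          # e.g. "mix", "transfer", "rinse"
    source: str        # reagent or array location
    target: str        # array location
    cycles: int = 1    # how many times to repeat the valve actuation pattern

serial_dilution = [
    Op("transfer", source="reagent_A", target="cell_1"),
    Op("transfer", source="buffer",    target="cell_1"),
    Op("mix",      source="cell_1",    target="cell_1", cycles=4),
    Op("transfer", source="cell_1",    target="cell_2"),   # carry dilution forward
    Op("rinse",    source="buffer",    target="cell_1"),
]

for step in serial_dilution:
    # A real controller would translate each Op into pneumatic valve actuations here.
    print(f"{step.name:8s} {step.source:10s} -> {step.target} (x{step.cycles})")
```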
Developing NDE Techniques for Large Cryogenic Tanks
NASA Technical Reports Server (NTRS)
Parker, Don; Starr, Stan; Arens, Ellen
2011-01-01
The Shuttle Program requires very large cryogenic ground storage tanks in which to store liquid oxygen and hydrogen. The existing Pads A and B Launch Complex-39 tanks, which will be passed on to future launch programs, are 45 years old and have received minimal refurbishment and only external inspections over the years. The majority of the structure is inaccessible without a full system drain of cryogenic liquid and granular insulation in the annular region. It was previously thought that there was a limit to the number of temperature cycles that the tanks could handle, due to possible insulation compaction, before undergoing a costly and time-consuming complete overhaul; therefore the tanks were not drained, and performance issues with these tanks, specifically the Pad B liquid hydrogen tank, were accepted. There is a need, and an opportunity, as the Shuttle program ends and work to upgrade the launch pads progresses, to develop innovative non-destructive evaluation (NDE) techniques to analyze the current tanks. Techniques are desired that can aid in determining the extent of refurbishment required to keep the tanks in service for another 20+ years. A nondestructive technique would also be a significant aid in acceptance testing of new and refurbished tanks, saving significant time and money, if corrective actions can be taken before cryogen is introduced to the systems.
Hands-On Universe: A Global Program for Education and Public Outreach in Astronomy
NASA Astrophysics Data System (ADS)
Boër, M.; Thiébaut, C.; Pack, H.; Pennypaker, C.; Isaac, M.; Melchior, A.-L.; Faye, S.; Ebisuzaki, T.
Hands-On Universe (HOU) is an educational program that enables students to investigate the Universe while applying tools and concepts from science, math, and technology. Using the Internet, HOU participants around the world request observations from an automated telescope, download images from a large image archive, and analyze them with the aid of user-friendly image processing software. This program is now in many countries, including the USA, France, Germany, Sweden, Japan, and Australia. A network of telescopes has been established, many of them remotely operated. Students in the classroom are able to make night observations during the day, using a telescope in another country. An archive of images taken on large telescopes is also accessible, as well as resources for teachers. Students deal with real research projects, e.g., the search for asteroids, which resulted in the discovery of a Kuiper Belt object by high-school students. Not only does Hands-On Universe give the general public access to professional astronomy, it also demonstrates the use of a complex automated system, data processing techniques, and automation. Using telescopes located in many countries over the globe, a powerful and genuine cooperation between teachers and children from various countries is promoted, with a clear educational goal.
2016-04-30
Panel 16. Improving Governance of Complex Systems Acquisition. Thursday, May 5, 2016, 11:15 a.m. – 12:45 p.m. Chair: Rear Admiral David Gale, USN, Program Executive Officer, SHIPS. Complex System Governance for Acquisition: Joseph Bradley, President, Leading Change, LLC; Bryan Moser, Lecturer, MIT; John Dickmann, Vice President, Sonalysts Inc. A Complex Systems Perspective of Risk Mitigation and Modeling in ...
DNAproDB: an interactive tool for structural analysis of DNA–protein complexes
Sagendorf, Jared M.
2017-01-01
Many biological processes are mediated by complex interactions between DNA and proteins. Transcription factors, various polymerases, nucleases and histones recognize and bind DNA with different levels of binding specificity. To understand the physical mechanisms that allow proteins to recognize DNA and achieve their biological functions, it is important to analyze structures of DNA–protein complexes in detail. DNAproDB is a web-based interactive tool designed to help researchers study these complexes. DNAproDB provides an automated structure-processing pipeline that extracts structural features from DNA–protein complexes. The extracted features are organized in structured data files, which are easily parsed with any programming language or viewed in a browser. We processed a large number of DNA–protein complexes retrieved from the Protein Data Bank and created the DNAproDB database to store this data. Users can search the database by combining features of the DNA, protein or DNA–protein interactions at the interface. Additionally, users can upload their own structures for processing privately and securely. DNAproDB provides several interactive and customizable tools for creating visualizations of the DNA–protein interface at different levels of abstraction that can be exported as high quality figures. All functionality is documented and freely accessible at http://dnaprodb.usc.edu. PMID:28431131
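Since the structured data files are meant to be parsed with any programming language, a hedged sketch of doing so in Python is shown below. The file name and every field name used here ("protein", "chains", "interface_residues", "id") are hypothetical placeholders, not the actual DNAproDB schema; the real field layout is documented at http://dnaprodb.usc.edu.

```python
# Hedged sketch of reading a structured DNA-protein interface report with standard
# JSON tooling. The field names are hypothetical placeholders, not the actual
# DNAproDB schema; consult the documentation at http://dnaprodb.usc.edu.
import json

with open("structure_report.json") as fh:          # file name is an example
    record = json.load(fh)

# Walk the assumed nested structure and count interface residues per protein chain.
for chain in record.get("protein", {}).get("chains", []):
    residues = chain.get("interface_residues", [])
    print(chain.get("id", "?"), "interface residues:", len(residues))
```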
Multimission Software Reuse in an Environment of Large Paradigm Shifts
NASA Technical Reports Server (NTRS)
Wilson, Robert K.
1996-01-01
The ground data systems provided for NASA space mission support are discussed. As space missions expand, the ground system requirements become more complex. Current ground data systems provide for telemetry, command, and uplink and downlink processing capabilities. The New Millennium Project (NMP), a technology testbed for 21st-century NASA missions, is discussed. The program demonstrates spacecraft and ground system technologies. The paradigm shift from detailed ground sequencing to a goal-oriented planning approach is considered. The work carried out to meet this paradigm for the Deep Space-1 (DS-1) mission is outlined.
PedVizApi: a Java API for the interactive, visual analysis of extended pedigrees.
Fuchsberger, Christian; Falchi, Mario; Forer, Lukas; Pramstaller, Peter P
2008-01-15
PedVizApi is a Java API (application programming interface) for the visual analysis of large and complex pedigrees. It provides all the necessary functionality for the interactive exploration of extended genealogies. While available packages are mostly focused on a static representation or cannot be added to an existing application, PedVizApi is a highly flexible open-source library for the efficient construction of visual-based applications for the analysis of family data. An extensive demo application and an R interface are provided. http://www.pedvizapi.org
1983-07-01
complex suite of physiological and respiratory adaptations (Ultch 1976) enable this species to occupy both open water and littoral zone environments ... be an adaptation to avoid fish predation on juveniles in open water and may explain the spring peak in funnel trap captures near shore (Fig. 10). ... population of C. picta probably is not established on Lake Conway. The one collected individual defecated gastropod (Viviparous sp.) shells and
Etiopathogenesis of Canine Hip Dysplasia, Prevalence, and Genetics.
King, Michael D
2017-07-01
First identified in 1935, canine hip dysplasia is thought to be the most common orthopedic condition diagnosed in the dog. It is most prevalent in large and giant breed dogs, with a complex polygenic mode of inheritance, and relatively low heritability. External factors including caloric intake when growing have a significant effect on phenotypic expression. Initial joint laxity progresses to osteoarthritis due to subluxation and abnormal wearing. Selective breeding programs to attempt to decrease prevalence have shown modest results so far. Copyright © 2017 Elsevier Inc. All rights reserved.
High-energy physics software parallelization using database techniques
NASA Astrophysics Data System (ADS)
Argante, E.; van der Stok, P. D. V.; Willers, I.
1997-02-01
A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. Because CoCa is based on the database transaction paradigm, the complexity induced by the parallelization is largely transparent to the programmer, resulting in a higher level of abstraction than the native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI.
Hazardous Materials Pharmacies - A Vital Component of a Robust P2 Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarter, S.
2006-07-01
Integrating pollution prevention (P2) into the Department of Energy Integrated Safety Management (ISM) - Environmental Management System (EMS) approach, required by DOE Order 450.1, leads to an enhanced ISM program at large and complex installations and facilities. One of the building blocks of integrating P2 into a comprehensive environmental and safety program is the control and tracking of the amounts, types, and flow of hazardous materials used at a facility. Hazardous materials pharmacies (typically called HazMarts) provide a solid approach to resolving this issue through business practice changes that reduce use, avoid excess, and redistribute surplus. If understood from concept to implementation, the HazMart is a powerful tool for reducing pollution at the source, tracking inventory storage, controlling usage and flow, and summarizing data for reporting requirements. Pharmacy options can range from a strict, single control point for all hazardous materials to a virtual system, where the inventory is user controlled and reported over a common system. Designing and implementing HazMarts on large, diverse installations or facilities present a unique set of issues. This is especially true of research and development (R and D) facilities where the chemical use requirements are extensive and often classified. There are often multiple sources of supply; a wide variety of chemical requirements; a mix of containers ranging from small ampoules to large bulk storage tanks; and a wide range of tools used to track hazardous materials, ranging from simple purchase inventories to sophisticated tracking software. Computer systems are often not uniform in capacity, capability, or operating systems, making it difficult to use a server-based unified tracking system software. Each of these issues has a solution or set of solutions tied to fundamental business practices. Each requires an understanding of the problem at hand, which, in turn, requires good communication among all potential users. A key attribute of a successful HazMart is that everybody must use the same program. That requirement often runs directly into the biggest issue of all... institutional resistance to change. To be successful, the program has to be both a top-down and bottom-up driven process. The installation or facility must set the policy and the requirement, but all of the players have to buy in and participate in building and implementing the program. Dynamac's years of experience assessing hazardous materials programs, providing business case analyses, and recommending and implementing pharmacy approaches for federal agencies have provided us with key insights into the issues, problems, and the array of solutions available. This paper presents the key steps required to implement a HazMart, explores the advantages and pitfalls associated with a HazMart, and presents some options for implementing a pharmacy or HazMart on complex installations and R and D facilities. (authors)
Riegel, Barbara; Lee, Christopher S; Sochalski, Julie
2010-05-01
Comparing disease management programs and their effects is difficult because of wide variability in program intensity and complexity. The purpose of this effort was to develop an instrument that can be used to describe the intensity and complexity of heart failure (HF) disease management programs. Specific composition criteria were taken from the American Heart Association (AHA) taxonomy of disease management and hierarchically scored to allow users to describe the intensity and complexity of the domains and subdomains of HF disease management programs. The HF Disease Management Scoring Instrument (HF-DMSI) incorporates 6 of the 8 domains from the taxonomy: recipient, intervention content, delivery personnel, method of communication, intensity/complexity, and environment. The 3 intervention content subdomains (education/counseling, medication management, and peer support) are described separately. In this first test of the HF-DMSI, overall intensity (measured as duration) and complexity were rated using an ordinal scoring system. Possible scores reflect a clinical rationale and differ by category, with zero given only if the element could potentially be missing (eg, surveillance by remote monitoring). Content validity was evident as the instrument matches the existing AHA taxonomy. After revision and refinement, 2 authors obtained an inter-rater reliability intraclass correlation coefficient score of 0.918 (confidence interval, 0.880 to 0.944, P<0.001) in their rating of 12 studies. The areas with most variability among programs were delivery personnel and method of communication. The HF-DMSI is useful for describing the intensity and complexity of HF disease management programs.
Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael
2003-01-01
We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.
Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.
Zauber, Henrik; Schulze, Waltraud X
2012-11-02
The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained increasing importance in hypothesis-driven research and systems biology in recent years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions at the protein level. Postprocessing and combining the peptide intensities of a proteomic data set require expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries for different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis, including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analyses, such as clustering, or ANOVA and t tests for comparisons between treatments. Results are presented in editable graphic formats and in list files.
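As a rough illustration of the kind of downstream step such a pipeline automates, the following Python sketch aggregates peptide ion intensities to the protein level after a simple median normalization and runs a per-protein t test. The column names, intensity values and the choice of median normalization are assumptions for illustration only, not cRacker's actual implementation (which is written in R).

```python
# Minimal sketch of peptide-to-protein aggregation (not cRacker's actual code).
import pandas as pd
from scipy import stats

# Assumed input: one row per peptide ion with intensities for two treatments.
peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P2", "P2", "P2"],
    "treat_A": [1.2e6, 0.9e6, 3.1e5, 2.8e5, 3.3e5],
    "treat_B": [2.4e6, 1.8e6, 3.0e5, 2.9e5, 3.2e5],
})

# Normalize each intensity column to its median (one common label-free strategy).
for col in ("treat_A", "treat_B"):
    peptides[col] = peptides[col] / peptides[col].median()

# Aggregate peptide intensities to the protein level by summation.
proteins = peptides.groupby("protein")[["treat_A", "treat_B"]].sum()

# Per-protein t test across peptides as a rough treatment comparison.
for prot, grp in peptides.groupby("protein"):
    t, p = stats.ttest_ind(grp["treat_A"], grp["treat_B"])
    print(prot, proteins.loc[prot].to_dict(), f"p={p:.3g}")
```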
Architecture independent environment for developing engineering software on MIMD computers
NASA Technical Reports Server (NTRS)
Valimohamed, Karim A.; Lopez, L. A.
1990-01-01
Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.
A depth-first search algorithm to compute elementary flux modes by linear programming.
Quek, Lake-Ee; Nielsen, Lars K
2014-07-30
The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
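The core of such an approach is an LP feasibility test applied at every node of the depth-first search. The sketch below (Python/SciPy, with a made-up toy stoichiometric matrix) shows one plausible form of that test; it is not the authors' implementation, and the network, reaction indices and helper name are assumptions.

```python
# Sketch of the LP feasibility test used inside a depth-first EFM search
# (toy network; not the authors' implementation).
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions), all reactions irreversible.
S = np.array([[ 1, -1,  0, -1],
              [ 0,  1, -1,  0],
              [ 0,  0,  1, -1]])

def feasible(active, inactive):
    """Is there a flux v >= 0 with S v = 0, v[i] >= 1 for i in `active`
    and v[j] = 0 for j in `inactive`?"""
    n = S.shape[1]
    bounds = [(0, None)] * n
    for i in active:
        bounds[i] = (1, None)      # force the reaction to carry flux
    for j in inactive:
        bounds[j] = (0, 0)         # exclude the reaction
    res = linprog(c=np.zeros(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return res.status == 0

# A depth-first search would branch on including/excluding each reaction,
# pruning any branch for which this test fails.
print(feasible(active=[0], inactive=[]))   # True: flux can route through the toy network
print(feasible(active=[1], inactive=[2]))  # False: excluding reaction 2 blocks reaction 1
```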
NASA Astrophysics Data System (ADS)
Xu, Chuanfu; Deng, Xiaogang; Zhang, Lilun; Fang, Jianbin; Wang, Guangxue; Jiang, Yi; Cao, Wei; Che, Yonggang; Wang, Yongxian; Wang, Zhenghua; Liu, Wei; Cheng, Xinghua
2014-12-01
Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when coordinating CPUs and accelerators to fully tap the potential of heterogeneous systems. In this paper, using a tri-level hybrid and heterogeneous programming model combining MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes, WCNS and HDCS, that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we employ the CPU and GPU collaboratively for HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the memory-poor GPU and the memory-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×; meanwhile, the collaborative approach improves performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for the ghost and singularity data of 3D grid blocks, and we overlap the collaborative computation and communication as far as possible using advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells. To the best of our knowledge, these are the largest-scale CPU-GPU collaborative simulations that solve realistic CFD problems with both complex configurations and high-order schemes.
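As a back-of-the-envelope illustration of the static load-balancing idea described above, the following sketch splits grid cells between CPU and GPU in proportion to their relative throughput while respecting the GPU's smaller memory. All numbers and the function name are hypothetical; the actual HOSTA scheme is more elaborate.

```python
# Illustrative static CPU/GPU load split of the kind described in the abstract
# (all numbers are hypothetical, not measurements from HOSTA or TianHe-1A).
def split_cells(total_cells, gpu_rate, cpu_rate, gpu_mem_cells):
    """Divide grid cells so both devices finish at roughly the same time,
    capping the GPU share at what fits in its (smaller) memory."""
    gpu_share = gpu_rate / (gpu_rate + cpu_rate)          # balance by throughput
    gpu_cells = min(int(total_cells * gpu_share), gpu_mem_cells)
    return gpu_cells, total_cells - gpu_cells

gpu_cells, cpu_cells = split_cells(total_cells=10_000_000,
                                   gpu_rate=1.3,          # GPU ~1.3x two CPUs
                                   cpu_rate=1.0,
                                   gpu_mem_cells=4_000_000)
print(gpu_cells, cpu_cells)
```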
Fault management for data systems
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann
1993-01-01
Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
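A minimal sketch of the graph-based idea, assuming a hypothetical component dependency graph (not the telescope system from the report): candidate root causes of an observed failure are found by reverse reachability.

```python
# Minimal sketch of graph-based fault diagnosis: candidate root causes are the
# components from which an observed failure is reachable in a dependency graph.
# The graph below is hypothetical, not the telescope system from the report.
from collections import deque

# edges: component -> components it can cause to fail
failure_graph = {
    "power_supply": ["controller", "heater"],
    "controller":   ["drive_motor", "data_bus"],
    "data_bus":     ["recorder"],
    "heater":       [],
    "drive_motor":  [],
    "recorder":     [],
}

def candidate_causes(observed_failure):
    """Reverse-reachability search: which components could explain the symptom?"""
    reverse = {node: [] for node in failure_graph}
    for src, dsts in failure_graph.items():
        for dst in dsts:
            reverse[dst].append(src)
    seen, queue = set(), deque([observed_failure])
    while queue:
        node = queue.popleft()
        for parent in reverse[node]:
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

print(candidate_causes("recorder"))  # -> data_bus, controller, power_supply
```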
HOBYS and W43-HERO: Two more steps toward a Galaxy-wide understanding of high-mass star formation
NASA Astrophysics Data System (ADS)
Motte, Frédérique; Bontemps, Sylvain; Tigé, Jérémy
The Herschel/HOBYS key program enables statistical study of the formation of 10-20 M⊙ stars. The IRAM/W43-HERO large program is itself dedicated to the much more extreme W43 molecular complex, which forms stars of up to 50 M⊙. Both reveal high-density cloud filaments of several pc³, which are forming clusters of OB-type stars. Given their activity, these so-called mini-starburst cloud ridges could be seen as "miniature and instant models" of starburst galaxies. Both surveys also strongly suggest that high-mass prestellar cores do not exist, in agreement with the dynamical formation of cloud ridges. The HOBYS and W43 surveys are necessary steps towards Galaxy-wide studies of high-mass star formation.
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
Experimental study of spectral and spatial distribution of solar X-rays
NASA Technical Reports Server (NTRS)
Acton, L. W.; Catura, R. C.; Culhane, J. L.
1972-01-01
The study of the physical conditions within the solar corona and the development of instrumentation and technical expertise necessary for advanced studies of solar X-ray emission are reported. Details are given on the Aerobee-borne X-ray spectrometer/monochromator and on the observing program. Preliminary discussions of some results are presented and include studies of helium-like line emission, mapping O(VII) and Ne(IX) lines, survey of O(VII) and Ne(IX) lines, study of plage regions and small flares, and analysis of line emission from individual active regions. It is concluded that the use of large-area collimated Bragg spectrometers to scan narrow wavelength intervals and the capability of the SPARCS pointing control to execute a complex observing program are established.
Leverage hadoop framework for large scale clinical informatics applications.
Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise
2013-01-01
In this manuscript, we present our experiences using the Apache Hadoop framework for high data volume and computationally intensive applications, and discuss some best practice guidelines in a clinical informatics setting. There are three main aspects in our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column oriented features in HBase for patient centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic advantage of fault tolerance, high availability and scalability of Hadoop platform makes these applications readily deployable at the enterprise level cluster environment.
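For readers unfamiliar with the MapReduce pattern referred to above, the following pure-Python toy reproduces its map, shuffle and reduce phases on invented patient records; a real deployment would express the same logic as a customized Hadoop MapReduce job.

```python
# Toy map/reduce pass (pure Python) of the kind a customized Hadoop MapReduce
# job would perform; record fields and codes are invented for illustration.
from collections import defaultdict

records = [
    ("patient_1", "E11.9"),   # (patient id, diagnosis code)
    ("patient_1", "I10"),
    ("patient_2", "E11.9"),
]

# Map phase: emit (key, 1) pairs per diagnosis code.
mapped = [(code, 1) for _, code in records]

# Shuffle phase: group values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: aggregate counts per diagnosis code.
counts = {key: sum(values) for key, values in grouped.items()}
print(counts)  # {'E11.9': 2, 'I10': 1}
```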
The Conceptual Complexity of Vocabulary in Elementary-Grades Core Science Program Textbooks
ERIC Educational Resources Information Center
Fitzgerald, W. Jill; Elmore, Jeff; Kung, Melody; Stenner, A. Jackson
2017-01-01
The researchers explored the conceptual complexity of vocabulary in contemporary elementary-grades core science program textbooks to address two research questions: (1) Can a progression of concepts' complexity level be described across grades? (2) Was there gradual developmental growth of the most complex concepts' networks of associated concepts…
Pembleton, Luke W; Inch, Courtney; Baillie, Rebecca C; Drayton, Michelle C; Thakur, Preeti; Ogaji, Yvonne O; Spangenberg, German C; Forster, John W; Daetwyler, Hans D; Cogan, Noel O I
2018-06-02
Exploitation of data from a ryegrass breeding program has enabled rapid development and implementation of genomic selection for sward-based biomass yield with a twofold-to-threefold increase in genetic gain. Genomic selection, which uses genome-wide sequence polymorphism data and quantitative genetics techniques to predict plant performance, has large potential for the improvement of pasture plants. Major factors influencing the accuracy of genomic selection include the size of reference populations, trait heritability values and the genetic diversity of breeding populations. Global diversity of the important forage species perennial ryegrass is high and so would require a large reference population in order to achieve moderate accuracies of genomic selection. However, diversity of germplasm within a breeding program is likely to be lower. In addition, de novo construction and characterisation of reference populations is a logistically complex process. Consequently, historical phenotypic records for seasonal biomass yield and heading date over an 18-year period within a commercial perennial ryegrass breeding program have been accessed, and target populations have been characterised with a high-density transcriptome-based genotyping-by-sequencing assay. The ability to predict observed phenotypic performance in each successive year was assessed by using all synthetic populations from previous years as a reference population. Moderate and high accuracies were achieved for the two traits, respectively, consistent with broad-sense heritability values. The present study represents the first demonstration and validation of genomic selection for seasonal biomass yield within a diverse commercial breeding program across multiple years. These results, supported by previous simulation studies, demonstrate the ability to predict sward-based phenotypic performance early in the process of individual plant selection, so shortening the breeding cycle, increasing the rate of genetic gain and allowing rapid adoption in ryegrass improvement programs.
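A minimal sketch of genomic prediction of the kind described, on simulated marker data; ridge regression stands in for a GBLUP-style model and every number here is synthetic, so this is an illustration of the workflow rather than the authors' pipeline.

```python
# Minimal genomic-prediction sketch on simulated data; ridge regression is a
# common stand-in for GBLUP-style models, not necessarily the authors' method.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_plants, n_markers = 500, 2000
X = rng.integers(0, 3, size=(n_plants, n_markers)).astype(float)  # 0/1/2 genotypes
true_effects = rng.normal(0, 0.05, n_markers)
y = X @ true_effects + rng.normal(0, 1.0, n_plants)               # biomass-like trait

# "Reference" population trains the model; "target" population is predicted.
X_ref, X_tgt, y_ref, y_tgt = train_test_split(X, y, test_size=0.3, random_state=0)
model = Ridge(alpha=100.0).fit(X_ref, y_ref)

# Prediction accuracy is usually reported as the correlation between
# predicted and observed phenotypes.
accuracy = np.corrcoef(model.predict(X_tgt), y_tgt)[0, 1]
print(f"prediction accuracy r = {accuracy:.2f}")
```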
Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane
2013-01-01
Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW projects interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023
High performance computing and communications: Advancing the frontiers of information technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.
Optimizing liquid effluent monitoring at a large nuclear complex.
Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M
2003-12-01
Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US dollars 223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
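One way to frame the exceedance-probability calculation mentioned above is to fit a distribution to baseline concentrations and evaluate its tail beyond the permit limit. The sketch below assumes a lognormal model and invented data; it is not the facility's actual analysis.

```python
# Sketch of an exceedance-probability check: fit a lognormal distribution to
# baseline concentrations and ask how often the permit limit would be exceeded.
# Concentrations and the limit are invented numbers, not data from the facility.
import numpy as np
from scipy import stats

baseline = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0, 1.4, 0.95])  # mg/L
permit_limit = 5.0                                                        # mg/L

shape, loc, scale = stats.lognorm.fit(baseline, floc=0)
p_exceed = stats.lognorm.sf(permit_limit, shape, loc=loc, scale=scale)
print(f"P(exceed {permit_limit} mg/L) ~ {p_exceed:.2e}")
```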
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The PROJECT proposes to install a new TCS micronized coal-fired heating plant for the Produkcja I Hodowla Roslin Ogrodniczych (PHRO) Greenhouse Complex in Krzeszowice, Poland (about 20 miles west of Krakow). PHRO currently utilizes 14 heavy oil-fired boilers to produce heat for its greenhouse facilities and also home heating to several adjacent apartment housing complexes. The boilers currently burn a high-sulfur content heavy crude oil, called Mazute. For size orientation, the PHRO Greenhouse complex grows a variety of vegetables and flowers for the Southern Poland marketplace. The greenhouse area under glass is very large and equivalent to approximately 50 football fields. The new micronized coal fired boiler would: (1) provide a significant portion of the heat for PHRO and a portion of the adjacent apartment housing complexes, (2) dramatically reduce sulfur dioxide air pollution emissions, while satisfying new Polish air regulations, and (3) provide attractive savings to PHRO, based on the quantity of displaced oil. Currently, the Town of Krzeszowice is considering a district heating program that would replace some, or all, of the 40 existing small in-town heating boilers that presently burn high-sulfur content coal. Potentially the district heating system can be expanded and connected into the PHRO boiler network so that PHRO boilers can supply all, or a portion of, the Town's heating demand. The new TCS micronized coal system could provide a portion of this demand.
Structured Light-Matter Interactions Enabled By Novel Photonic Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litchinitser, Natalia; Feng, Liang
The synergy of complex materials and complex light is expected to add a new dimension to the science of light and its applications [1]. The goal of this program is to investigate novel phenomena emerging at the interface of these two branches of modern optics. While metamaterials research was largely focused on relatively “simple” linearly or circularly polarized light propagation in “complex” nanostructured, carefully designed materials with properties not found in nature, many singular optics studies addressed “complex” structured light transmission in “simple” homogeneous, isotropic, nondispersive transparent media, where both spin and orbital angular momentum are independently conserved. However, if both light and medium are complex so that structured light interacts with a metamaterial whose optical materials properties can be designed at will, the spin or angular momentum can change, which leads to spin-orbit interaction and many novel optical phenomena that will be studied in the proposed project. Indeed, metamaterials enable unprecedented control over light propagation, opening new avenues for using spin and quantum optical phenomena, and design flexibility facilitating new linear and nonlinear optical properties and functionalities, including negative index of refraction, magnetism at optical frequencies, giant optical activity, subwavelength imaging, cloaking, dispersion engineering, and unique phase-matching conditions for nonlinear optical interactions. In this research program we focused on structured light-matter interactions in complex media with three particularly remarkable properties that were enabled only with the emergence of metamaterials: extreme anisotropy, extreme material parameters, and magneto-electric coupling (bi-anisotropy and chirality).
Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M
2006-04-21
Genetic epidemiologists have taken up the challenge of identifying genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation of large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN), and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods for approaching association studies with large numbers of predictor variables. GPNN, on the other hand, may be a useful approach to select and model important predictors, but its ability to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and the random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset with an important contribution to disease. The combinatorial methods give more insight into combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses, we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
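To make the comparison concrete, the sketch below runs two of the discussed methods, logistic regression and random forests, on simulated case-control SNP data containing a two-locus interaction. The data, effect sizes and model settings are arbitrary choices for illustration.

```python
# Illustration of two of the methods discussed (logistic regression vs. random
# forests) on simulated SNP data with an interaction effect; purely synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n, p = 600, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)          # SNP genotypes 0/1/2
logit = 0.8 * (X[:, 0] * X[:, 1]) - 1.5                     # interaction between two SNPs
y = rng.random(n) < 1 / (1 + np.exp(-logit))                # case/control outcome

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: cross-validated AUC = {auc:.2f}")
```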
36 CFR 800.14 - Federal agency program alternatives.
Code of Federal Regulations, 2010 CFR
2010-07-01
... program or the resolution of adverse effects from certain complex project situations or multiple... by the agreement. (3) Developing programmatic agreements for complex or multiple undertakings. Consultation to develop a programmatic agreement for dealing with the potential adverse effects of complex...
AutoBayes Program Synthesis System System Internals
NASA Technical Reports Server (NTRS)
Schumann, Johann Martin
2011-01-01
This lecture combines the theoretical background of schema-based program synthesis with the hands-on study of a powerful, open-source program synthesis system (AutoBayes). Schema-based program synthesis is a popular approach toward program synthesis. The lecture will provide an introduction to this topic and discuss how this technology can be used to generate customized algorithms. The synthesis of advanced numerical algorithms requires the availability of a powerful symbolic (algebra) system. Its task is to symbolically solve equations, simplify expressions, or symbolically calculate derivatives (among others) such that the synthesized algorithms become as efficient as possible. We will discuss the use and importance of the symbolic system for synthesis. Any synthesis system is a large and complex piece of code. In this lecture, we will study AutoBayes in detail. AutoBayes has been developed at NASA Ames and has been made open source. It takes a compact statistical specification and generates a customized data analysis algorithm (in C/C++) from it. AutoBayes is written in SWI Prolog and uses many concepts from rewriting, logic, functional, and symbolic programming. We will discuss the system architecture, the schema library and the extensive support infrastructure. Practical hands-on experiments and exercises will enable the student to get insight into a realistic program synthesis system and provide the knowledge needed to use, modify, and extend AutoBayes.
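The role of the symbolic algebra system can be illustrated outside AutoBayes itself. The SymPy sketch below (Python, not AutoBayes' Prolog) derives the closed-form maximum-likelihood estimate of a Gaussian mean by solving the score equation symbolically, the kind of step a synthesis system performs before emitting code.

```python
# Illustration (in SymPy, not AutoBayes' Prolog) of the kind of symbolic step a
# synthesis system performs: solving a likelihood equation in closed form.
import sympy as sp

mu, sigma = sp.symbols("mu sigma", positive=True)
xs = sp.symbols("x1:5")          # four observations x1..x4 (illustrative)

# Gaussian log-likelihood (constant terms dropped).
loglik = sum(-sp.log(sigma) - (xi - mu) ** 2 / (2 * sigma ** 2) for xi in xs)

# Solve d(loglik)/d(mu) = 0 symbolically for mu: the sample mean pops out.
mu_hat = sp.solve(sp.diff(loglik, mu), mu)[0]
print(sp.simplify(mu_hat))       # (x1 + x2 + x3 + x4)/4
```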
MrEnt: an editor for publication-quality phylogenetic tree illustrations.
Zuccon, Alessandro; Zuccon, Dario
2014-09-01
We developed MrEnt, a Windows-based, user-friendly software package that allows the production of complex, high-resolution, publication-quality phylogenetic trees in a few steps, directly from the analysis output. The program recognizes the standard Nexus tree format and the annotated tree files produced by BEAST and MrBayes. MrEnt combines in a single application a large suite of tree manipulation functions (e.g. handling of multiple trees, tree rotation, character mapping, node collapsing, compression of large clades, handling of time scale and error bars for chronograms) with drawing tools typical of standard graphic editors, including handling of graphic elements and images. The tree illustration can be printed or exported in several standard formats suitable for journal publication, PowerPoint presentation or Web publication. © 2014 John Wiley & Sons Ltd.
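MrEnt itself is a Windows GUI, but the underlying workflow of reading, annotating and rendering a tree can be sketched programmatically; the Biopython example below uses a made-up Newick tree and is only an analogue, not MrEnt's code.

```python
# Rough programmatic analogue of a tree-editing workflow using Biopython;
# the tree and clade name are invented for illustration.
from io import StringIO
from Bio import Phylo

newick = "((A:0.1,B:0.2):0.05,(C:0.3,D:0.4):0.1);"   # toy tree
tree = Phylo.read(StringIO(newick), "newick")        # Nexus is also supported
tree.root.clades[0].name = "clade_AB"                # simple annotation/edit
Phylo.draw_ascii(tree)                               # text rendering of the tree
```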
School lunch program in India: background, objectives and components.
Chutani, Alka Mohan
2012-01-01
The School Lunch Program in India (SLP) is the largest food and nutrition assistance program, feeding millions of children every day. This paper reviews the background of the SLP in India, earlier known as the National Program for Nutrition Support to Primary Education (NP-NSPE) and later as the Mid Day Meal Scheme, including historical trends, objectives and components/characteristics of the scheme. It also addresses steps being taken by the program's administrators to meet challenges in monitoring and evaluation. The program was initially started in 1960 in a few states to overcome the complex problems of malnutrition and illiteracy. Mid Day Meal Scheme is the popular name for the school meal program. In 2001, as per Supreme Court orders, it became mandatory to provide a mid day meal to all primary, and later upper primary, school children studying in government and government-aided schools. The scheme benefited 140 million children in government-assisted schools across India in 2008, strengthening child nutrition and literacy. In a country where a large share of the population is illiterate and many children are unable to read or write, governmental and non-governmental organizations have reported that the Mid Day Meal Scheme has consistently increased school enrollment in India. One of the main goals of the school lunch program is to promote the health and well-being of the Nation's children.
When weight management lasts. Lower perceived rule complexity increases adherence.
Mata, Jutta; Todd, Peter M; Lippke, Sonia
2010-02-01
Maintaining behavior change is one of the major challenges in weight management and long-term weight loss. We investigated the impact of the cognitive complexity of eating rules on adherence to weight management programs. We studied whether popular weight management programs can fail if participants find the rules too complicated from a cognitive perspective, meaning that individuals are not able to recall or process all required information for deciding what to eat. The impact on program adherence of participants' perceptions of eating rule complexity and other behavioral factors known to influence adherence (including previous weight management, self-efficacy, and planning) was assessed via a longitudinal online questionnaire given to 390 participants on two different popular weight management regimens. As we show, the regimens, Weight Watchers and a popular German recipe diet (Brigitte), strongly differ in objective rule complexity and thus their cognitive demands on the dieter. Perceived rule complexity was the strongest factor associated with increased risk of quitting the cognitively demanding weight management program (Weight Watchers); it was not related to adherence length for the low cognitive demand program (Brigitte). Higher self-efficacy generally helped in maintaining a program. The results emphasize the importance of considering rule complexity to promote long-term weight management. 2009 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Selkirk, Henry B.
2001-01-01
This report summarizes work conducted from January 1996 through April 1999 on a program of research to investigate the physical mechanisms that underlie the transport of trace constituents in the stratosphere-troposphere system. The primary scientific goal of the research has been to identify the processes which transport air masses within the lower stratosphere, particularly between the tropics and middle latitudes. This research was conducted in collaboration with the Subsonic Assessment (SASS) of the NASA Atmospheric Effects of Radiation Program (AEAP) and the Upper Atmospheric Research Program (UARP). The SASS program sought to understand the impact of the present and future fleets of conventional jet traffic on the upper troposphere and lower stratosphere, while complementary airborne observations under UARP seek to understand the complex interactions of dynamical and chemical processes that affect the ozone layer. The present investigation contributed to the goals of each of these by diagnosing the history of air parcels intercepted by NASA research aircraft in UARP and AEAP campaigns. This was done by means of a blend of trajectory analyses and tracer correlation techniques.
A model for managing sources of groundwater pollution
Gorelick, Steven M.
1982-01-01
The waste disposal capacity of a groundwater system can be maximized while maintaining water quality at specified locations by using a groundwater pollutant source management model that is based upon linear programming and numerical simulation. The decision variables of the management model are solute waste disposal rates at various facilities distributed over space. A concentration response matrix is used in the management model to describe transient solute transport and is developed using the U.S. Geological Survey solute transport simulation model. The management model was applied to a complex hypothetical groundwater system. Large-scale management models were formulated as dual linear programming problems to reduce numerical difficulties and computation time. Linear programming problems were solved using a numerically stable, available code. Optimal solutions to problems with successively longer management time horizons indicated that disposal schedules at some sites are relatively independent of the number of disposal periods. Optimal waste disposal schedules exhibited pulsing rather than constant disposal rates. Sensitivity analysis using parametric linear programming showed that a sharp reduction in total waste disposal potential occurs if disposal rates at any site are increased beyond their optimal values.
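A toy version of such a management model can be written directly as a linear program: maximize total disposal subject to concentration limits expressed through the response matrix. The matrix, limits and bounds below are invented for illustration.

```python
# Toy version of the management model: maximize total waste disposal subject to
# concentration limits expressed through a response matrix (all numbers invented).
import numpy as np
from scipy.optimize import linprog

# R[i, j]: concentration increase at observation point i per unit disposal rate
# at facility j (the "concentration response matrix" from simulation runs).
R = np.array([[0.8, 0.1, 0.3],
              [0.2, 0.9, 0.4]])
c_max = np.array([6.0, 8.0])          # water-quality limits at the two points
disposal_cap = [(0, 6.0)] * 3         # per-facility disposal-rate bounds

# linprog minimizes, so negate the objective to maximize total disposal.
res = linprog(c=-np.ones(3), A_ub=R, b_ub=c_max, bounds=disposal_cap, method="highs")
print("optimal disposal rates:", res.x, "total:", -res.fun)
```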
Using CLIPS in the domain of knowledge-based massively parallel programming
NASA Technical Reports Server (NTRS)
Dvorak, Jiri J.
1994-01-01
The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.
Astrochemical evolution along star formation: overview of the IRAM Large Program ASAI
NASA Astrophysics Data System (ADS)
Lefloch, Bertrand; Bachiller, R.; Ceccarelli, C.; Cernicharo, J.; Codella, C.; Fuente, A.; Kahane, C.; López-Sepulcre, A.; Tafalla, M.; Vastel, C.; Caux, E.; González-García, M.; Bianchi, E.; Gómez-Ruiz, A.; Holdship, J.; Mendoza, E.; Ospina-Zamudio, J.; Podio, L.; Quénard, D.; Roueff, E.; Sakai, N.; Viti, S.; Yamamoto, S.; Yoshida, K.; Favre, C.; Monfredini, T.; Quitián-Lara, H. M.; Marcelino, N.; Boechat-Roberty, H. M.; Cabrit, S.
2018-07-01
Evidence is mounting that the small bodies of our Solar system, such as comets and asteroids, have at least partially inherited their chemical composition from the first phases of the Solar system formation. It then appears that the molecular complexity of these small bodies is most likely related to the earliest stages of star formation. It is therefore important to characterize and to understand how the chemical evolution changes with solar-type protostellar evolution. We present here the Large Program `Astrochemical Surveys At IRAM' (ASAI). Its goal is to carry out unbiased millimetre line surveys between 80 and 272 GHz of a sample of 10 template sources, which fully cover the first stages of the formation process of solar-type stars, from pre-stellar cores to the late protostellar phase. In this paper, we present an overview of the surveys and results obtained from the analysis of the 3 mm band observations. The number of detected main isotopic species barely varies with the evolutionary stage and is found to be very similar to that of massive star-forming regions. The molecular content in O- and C-bearing species allows us to define two chemical classes of envelopes, whose composition is dominated by either (a) a rich content in O-rich complex organic molecules, associated with hot corino sources, or (b) a rich content in hydrocarbons, typical of warm carbon-chain chemistry sources. Overall, a high chemical richness is found to be present already in the initial phases of solar-type star formation.
Optimal space-time attacks on system state estimation under a sparsity constraint
NASA Astrophysics Data System (ADS)
Lu, Jingyang; Niu, Ruixin; Han, Puxiao
2016-05-01
System state estimation in the presence of an adversary that injects false information into sensor readings has attracted much attention in wide application areas, such as target tracking with compromised sensors, secure monitoring of dynamic electric power systems, secure driverless cars, and radar tracking and detection in the presence of jammers. From a malicious adversary's perspective, the optimal strategy for attacking a multi-sensor dynamic system over sensors and over time is investigated. It is assumed that the system defender can perfectly detect the attacks and identify and remove sensor data once they are corrupted by false information injected by the adversary. With this in mind, the adversary's goal is to maximize the covariance matrix of the system state estimate by the end of attack period under a sparse attack constraint such that the adversary can only attack the system a few times over time and over sensors. The sparsity assumption is due to the adversary's limited resources and his/her intention to reduce the chance of being detected by the system defender. This becomes an integer programming problem and its optimal solution, the exhaustive search, is intractable with a prohibitive complexity, especially for a system with a large number of sensors and over a large number of time steps. Several suboptimal solutions, such as those based on greedy search and dynamic programming are proposed to find the attack strategies. Examples and numerical results are provided in order to illustrate the effectiveness and the reduced computational complexities of the proposed attack strategies.
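The flavor of the proposed suboptimal strategies can be conveyed with a greedy sketch on a scalar toy model: each attacked measurement is assumed to be detected and discarded, and the attacker greedily picks the time steps whose removal most inflates the final Kalman-filter variance. The model and parameters are illustrative, not the paper's formulation.

```python
# Greedy sketch of choosing a sparse set of attack instants: each attacked
# measurement is detected and discarded by the defender, so the attacker picks
# the K time steps whose removal most inflates the final estimation variance.
# Scalar random-walk model and all parameters are illustrative only.
def final_variance(attacked, n_steps=20, q=1.0, r=1.0, p0=1.0):
    p = p0
    for k in range(n_steps):
        p = p + q                       # prediction step (process noise q)
        if k not in attacked:           # update only with surviving measurements
            p = p * r / (p + r)
    return p

def greedy_attack(budget, n_steps=20):
    attacked = set()
    for _ in range(budget):
        best = max((k for k in range(n_steps) if k not in attacked),
                   key=lambda k: final_variance(attacked | {k}, n_steps))
        attacked.add(best)
    return attacked

chosen = greedy_attack(budget=3)
print(sorted(chosen), final_variance(chosen))   # attacks cluster near the end
```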
Planning and executing complex large-scale exercises.
McCormick, Lisa C; Hites, Lisle; Wakelee, Jessica F; Rucks, Andrew C; Ginter, Peter M
2014-01-01
Increasingly, public health departments are designing and engaging in complex operations-based full-scale exercises to test multiple public health preparedness response functions. The Department of Homeland Security's Homeland Security Exercise and Evaluation Program (HSEEP) supplies benchmark guidelines that provide a framework for both the design and the evaluation of drills and exercises; however, the HSEEP framework does not seem to have been designed to manage the development and evaluation of multiple, operations-based, parallel exercises combined into 1 complex large-scale event. Lessons learned from the planning of the Mississippi State Department of Health Emergency Support Function--8 involvement in National Level Exercise 2011 were used to develop an expanded exercise planning model that is HSEEP compliant but accounts for increased exercise complexity and is more functional for public health. The Expanded HSEEP (E-HSEEP) model was developed through changes in the HSEEP exercise planning process in areas of Exercise Plan, Controller/Evaluator Handbook, Evaluation Plan, and After Action Report and Improvement Plan development. The E-HSEEP model was tested and refined during the planning and evaluation of Mississippi's State-level Emergency Support Function-8 exercises in 2012 and 2013. As a result of using the E-HSEEP model, Mississippi State Department of Health was able to capture strengths, lessons learned, and areas for improvement, and identify microlevel issues that may have been missed using the traditional HSEEP framework. The South Central Preparedness and Emergency Response Learning Center is working to create an Excel-based E-HSEEP tool that will allow practice partners to build a database to track corrective actions and conduct many different types of analyses and comparisons.
NASA Astrophysics Data System (ADS)
Kalatzis, Fanis G.; Papageorgiou, Dimitrios G.; Demetropoulos, Ioannis N.
2006-09-01
The Merlin/MCL optimization environment and the GAMESS-US package were combined so as to offer an extended and efficient quantum chemistry optimization system, capable of implementing complex optimization strategies for generic molecular modeling problems. A communication and data exchange interface was established between the two packages, exploiting all Merlin features such as multiple optimizers, box constraints, user extensions and a high-level programming language. An important feature of the interface is its ability to perform dimer computations by eliminating the basis set superposition error using the counterpoise (CP) method of Boys and Bernardi. Furthermore, it offers CP-corrected geometry optimizations using analytic derivatives. The unified optimization environment was applied to construct portions of the intermolecular potential energy surface of the weakly bound H-bonded complex C6H6-H2O by utilizing the high-level Merlin Control Language. The H-bonded dimer HF-H2O was also studied by CP-corrected geometry optimization. The ab initio electronic structure energies were calculated using the 6-31G** basis set at the Restricted Hartree-Fock and second-order Møller-Plesset levels, while all geometry optimizations were carried out using a quasi-Newton algorithm provided by Merlin.
Program summary
Title of program: MERGAM
Catalogue identifier: ADYB_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYB_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computer for which the program is designed and others on which it has been tested: The program is designed for machines running the UNIX operating system. It has been tested on the following architectures: IA32 (Linux with gcc/g77 v.3.2.3), AMD64 (Linux with the Portland group compilers v.6.0), SUN64 (SunOS 5.8 with the Sun Workshop compilers v.5.2) and SGI64 (IRIX 6.5 with the MIPSpro compilers v.7.4)
Installations: University of Ioannina, Greece
Operating systems or monitors under which the program has been tested: UNIX
Programming language used: ANSI C, ANSI Fortran-77
No. of lines in distributed program, including test data, etc.: 11 282
No. of bytes in distributed program, including test data, etc.: 49 458
Distribution format: tar.gz
Memory required to execute with typical data: Memory requirements mainly depend on the selection of a GAMESS-US basis set and the number of atoms
No. of bits in a word: 32
No. of processors used: 1
Has the code been vectorized or parallelized?: no
Nature of physical problem: Multidimensional geometry optimization is of great importance in any ab initio calculation, since it is usually one of the most CPU-intensive tasks, especially for large molecular systems. For example, the geometric and energetic description of van der Waals and weakly bound H-bonded complexes requires the construction of the relevant portions of the multidimensional intermolecular potential energy surface (IPES), so that the various views held about the nature of these bonds can be quantitatively tested.
Method of solution: The Merlin/MCL optimization environment was interconnected with the GAMESS-US package to facilitate geometry optimization in quantum chemistry problems. Constructing the important portions of the IPES requires the capability to program optimization strategies; the Merlin/MCL environment was used for the implementation of such strategies. In this work, a CP-corrected geometry optimization was performed on the HF-H2O complex and an MCL program was developed to study portions of the potential energy surface of the C6H6-H2O complex.
Restrictions on the complexity of the problem: The Merlin optimization environment and the GAMESS-US package must be installed. The MERGAM interface requires GAMESS-US input files constructed in Cartesian coordinates. This restriction stems from a design-time requirement not to allow reorientation of atomic coordinates; this rule always holds when the COORD = UNIQUE keyword is applied in a GAMESS-US input file.
Typical running time: Depends on the size of the molecular system, the size of the basis set and the method of electron correlation. Execution of the test run took approximately 5 min on a 2.8 GHz Intel Pentium CPU.
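For reference, the Boys-Bernardi counterpoise correction invoked above estimates the BSSE-free interaction energy of a dimer AB by evaluating each monomer in the full dimer basis (superscripts denote the basis set used, and all terms are taken at the dimer geometry):

```latex
E_{\mathrm{int}}^{\mathrm{CP}} \;=\; E_{AB}^{AB} \;-\; E_{A}^{AB} \;-\; E_{B}^{AB}
```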
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems from various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in parallel and distributed computing pay special attention to the scalability of applications for problem solving, yet effective management of scalable applications in a heterogeneous distributed computing environment remains a non-trivial issue. Control systems that operate in networks are especially affected by this issue. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for this problem include automated multi-agent control of the systems in parallel mode at various degrees of detail.
A neural network approach to job-shop scheduling.
Zhou, D N; Cherkassky, V; Baldwin, T R; Olson, D E
1991-01-01
A novel analog computational network is presented for solving NP-complete constraint satisfaction problems, i.e. job-shop scheduling. In contrast to most neural approaches to combinatorial optimization based on quadratic energy cost function, the authors propose to use linear cost functions. As a result, the network complexity (number of neurons and the number of resistive interconnections) grows only linearly with problem size, and large-scale implementations become possible. The proposed approach is related to the linear programming network described by D.W. Tank and J.J. Hopfield (1985), which also uses a linear cost function for a simple optimization problem. It is shown how to map a difficult constraint-satisfaction problem onto a simple neural net in which the number of neural processors equals the number of subjobs (operations) and the number of interconnections grows linearly with the total number of operations. Simulations show that the authors' approach produces better solutions than existing neural approaches to job-shop scheduling, i.e. the traveling salesman problem-type Hopfield approach and integer linear programming approach of J.P.S. Foo and Y. Takefuji (1988), in terms of the quality of the solution and the network complexity.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
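A minimal sketch of an activity-transition-graph agent, with activities as functions and transitions guarded by the agent's data state; the activity names and the sensor-threshold example are invented, and real platforms compile such graphs to microchip-level stack-machine code rather than Python.

```python
# Minimal sketch of an activity-transition-graph (ATG) agent: activities run in
# sequence, and guarded transitions over the agent's data state pick the next
# activity. Names and the threshold example are invented for illustration.
class ATGAgent:
    def __init__(self, readings):
        self.state = {"readings": readings, "alarm": False}
        self.activity = "sense"
        # activity -> list of (guard, next_activity); first true guard wins
        self.transitions = {
            "sense":    [(lambda s: True, "evaluate")],
            "evaluate": [(lambda s: s["alarm"], "notify"),
                         (lambda s: True, "sense")],
            "notify":   [(lambda s: True, "sense")],
        }

    def step(self):
        if self.activity == "evaluate":
            self.state["alarm"] = max(self.state["readings"]) > 0.8
        elif self.activity == "notify":
            print("agent: anomaly detected, notifying neighbours")
        for guard, nxt in self.transitions[self.activity]:
            if guard(self.state):
                self.activity = nxt
                break

agent = ATGAgent(readings=[0.2, 0.95, 0.4])
for _ in range(4):
    agent.step()
```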
Asplund-Samuelsson, Johannes; Bergman, Birgitta; Larsson, John
2012-01-01
Caspases accomplish initiation and execution of apoptosis, a programmed cell death process specific to metazoans. The existence of prokaryotic caspase homologs, termed metacaspases, has been known for slightly more than a decade. Despite their potential connection to the evolution of programmed cell death in eukaryotes, the phylogenetic distribution and functions of these prokaryotic metacaspase sequences are largely uncharted, while a few experiments imply involvement in programmed cell death. Aiming at providing a more detailed picture of prokaryotic caspase homologs, we applied a computational approach based on Hidden Markov Model search profiles to identify and functionally characterize putative metacaspases in bacterial and archaeal genomes. Out of the total of 1463 analyzed genomes, merely 267 (18%) were identified to contain putative metacaspases, but their taxonomic distribution included most prokaryotic phyla and a few archaea (Euryarchaeota). Metacaspases were particularly abundant in Alphaproteobacteria, Deltaproteobacteria and Cyanobacteria, which harbor many morphologically and developmentally complex organisms, and a distinct correlation was found between abundance and phenotypic complexity in Cyanobacteria. Notably, Bacillus subtilis and Escherichia coli, known to undergo genetically regulated autolysis, lacked metacaspases. Pfam domain architecture analysis combined with operon identification revealed rich and varied configurations among the metacaspase sequences. These imply roles in programmed cell death, but also e.g. in signaling, various enzymatic activities and protein modification. Together our data show a wide and scattered distribution of caspase homologs in prokaryotes with structurally and functionally diverse sub-groups, and with a potentially intriguing evolutionary role. These features will help delineate future characterizations of death pathways in prokaryotes. PMID:23185476
Physiological and comparative evidence fails to confirm an adaptive role for aging in evolution.
Cohen, Alan A
2015-01-01
The longstanding debate about whether aging may have evolved for some adaptive reason is generally considered to pit evolutionary theory against empirical observations consistent with aging as a programmed aspect of organismal biology, in particular conserved aging genes. Here I argue that the empirical evidence on aging mechanisms does not support a view of aging as a programmed phenomenon, but rather supports a view of aging as the dysregulation of complex networks that maintain organismal homeostasis. The appearance of programming is due largely to the inadvertent activation of existing pathways during the process of dysregulation. It is argued that aging differs markedly from known programmed biological phenomena such as apoptosis in that it is (a) very heterogeneous in how it proceeds, and (b) much slower than it would need to be. Furthermore, the taxonomic distribution of aging across species does not support any proposed adaptive theories of aging, which would predict that aging rate would vary on a finer taxonomic scale depending on factors such as population density. Thus, while there are problems with the longstanding non-adaptive paradigm, current evidence does not support the notion that aging is programmed or that it may have evolved for adaptive reasons.
Atchison, Michael L
2009-01-01
There is a nationwide shortage of veterinarian-scientists in the United States. Barriers to recruiting veterinary students into research careers need to be identified, and mechanisms devised to reduce these barriers. Barriers to attracting veterinary students into research careers include ignorance of available research careers and of the training opportunities. Once admitted, students in research training programs often feel isolated, fitting into neither the veterinary environment nor the research environment. To address the above issues, it is necessary to advertise and educate the public about opportunities for veterinarian-scientists. Schools need to develop high-quality training programs that are well structured but retain appropriate flexibility. Sufficient resources are needed to operate these programs so that students do not graduate with significant debt. A community of veterinarian-scientists needs to be developed so that students do not feel isolated but, rather, are part of a large community of like-minded individuals. Because of the complexities of programs that train veterinarian-scientists, it is necessary to provide extensive advising and for faculty to develop a proactive, servant-leadership attitude. Finally, students must be made aware of career options after graduation.
Manycore Performance-Portability: Kokkos Multidimensional Array Library
Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...
2012-01-01
Large, complex scientific and engineering application codes have a significant investment in computational kernels to implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implement computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices each with its own memory space, (2) data parallel kernels and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. The optimal data access pattern can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
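The central idea, keeping the kernel source unchanged while the array's data access mapping is chosen per device, can be illustrated outside of Kokkos. The sketch below uses NumPy memory orders as a stand-in for the Kokkos LayoutRight/LayoutLeft layout policies; it is a conceptual analogy, not Kokkos code:

    import numpy as np

    def kernel(field):
        # Device-independent "kernel": reduces each row of a 2-D multidimensional array.
        # The same source runs regardless of how the array is laid out in memory.
        return field.sum(axis=1)

    n = 1024
    # Layout chosen per "device": row-major (C order) vs. column-major (Fortran order),
    # standing in here for Kokkos LayoutRight / LayoutLeft policies.
    field_row_major = np.ones((n, n), order="C")
    field_col_major = np.ones((n, n), order="F")

    # The kernel is unchanged; only the data access mapping differs, which is the
    # performance-portability idea: the layout, not the kernel, is specialized per device.
    assert np.array_equal(kernel(field_row_major), kernel(field_col_major))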
CORSS: Cylinder Optimization of Rings, Skin, and Stringers
NASA Technical Reports Server (NTRS)
Finckenor, J.; Rogers, P.; Otte, N.
1994-01-01
Launch vehicle designs typically make extensive use of cylindrical skin stringer construction. Structural analysis methods are well developed for preliminary design of this type of construction. This report describes an automated, iterative method to obtain a minimum weight preliminary design. Structural optimization has been researched extensively, and various programs have been written for this purpose. Their complexity and ease of use depend on their generality, the failure modes considered, the methodology used, and the rigor of the analysis performed. This computer program employs closed-form solutions from a variety of well-known structural analysis references and joins them with a commercially available numerical optimizer called the 'Design Optimization Tool' (DOT). Any ring and stringer stiffened shell structure of isotropic materials that has beam type loading can be analyzed. Plasticity effects are not included. It performs a more limited analysis than programs such as PANDA, but it provides an easy and useful preliminary design tool for a large class of structures. This report briefly describes the optimization theory, outlines the development and use of the program, and describes the analysis techniques that are used. Examples of program input and output, as well as the listing of the analysis routines, are included.
Program helps quickly calculate deviated well path
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, M.P.
1993-11-22
A BASIC computer program quickly calculates the angle and measured depth of a simple directional well given only the true vertical depth and total displacement of the target. Many petroleum engineers and geologists need a quick, easy method to calculate the angle and measured depth necessary to reach a target in a proposed deviated well bore. Too many of the existing programs are large and require much input data. The drilling literature is full of equations and methods to calculate the course of well paths from surveys taken after a well is drilled. Very little information, however, covers how to calculate well bore trajectories for proposed wells from limited data. Furthermore, many of the equations are quite complex and difficult to use. A figure lists a computer program with the equations to calculate the well bore trajectory necessary to reach a given displacement and true vertical depth (TVD) for a simple build plan. It can be run on an IBM compatible computer with MS-DOS version 5 or higher, QBasic, or any BASIC that does not require line numbers. The QBasic 4.5 compiler will also run the program. The equations are based on conventional geometry and trigonometry.
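In the simplest case, ignoring the build curvature and treating the well as a single straight slant from surface to target, the angle and measured depth follow directly from trigonometry. A minimal Python sketch of that simplified calculation (not the article's BASIC program, which handles a build-and-hold profile) is:

    import math

    def simple_slant_path(tvd_ft, displacement_ft):
        """Angle (degrees from vertical) and measured depth for a straight slant path.

        Simplification: a real build-and-hold design would also need the kickoff depth
        and build rate, which this sketch deliberately omits.
        """
        angle_deg = math.degrees(math.atan2(displacement_ft, tvd_ft))
        measured_depth_ft = math.hypot(tvd_ft, displacement_ft)
        return angle_deg, measured_depth_ft

    angle, md = simple_slant_path(tvd_ft=9000.0, displacement_ft=3000.0)
    print(f"hold angle: {angle:.1f} deg, measured depth: {md:.0f} ft")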
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems and as such new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and highlight the need for further methodological development.
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"
Security and Cloud Outsourcing Framework for Economic Dispatch
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi
The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm of solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that, for large grid test cases, the cloud approach outperforms the in-house infrastructure in both performance and cost.
Security and Cloud Outsourcing Framework for Economic Dispatch
Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...
2017-04-24
The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm of solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that, for large grid test cases, the cloud approach outperforms the in-house infrastructure in both performance and cost.
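The abstract does not spell out the masking transformation; a common recipe in the confidentiality-preserving LP literature is an affine change of variables combined with a random positive scaling of the constraints. The sketch below applies that generic recipe to a toy two-generator dispatch problem with SciPy; the scheme, matrices and data are assumptions for illustration, not the authors' framework:

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)

    # Toy economic dispatch: minimize generation cost subject to demand and capacities.
    c = np.array([20.0, 30.0])                  # $/MWh for two generators
    A = np.array([[-1.0, -1.0],                 # -(g1 + g2) <= -demand
                  [ 1.0,  0.0],                 #  g1 <= cap1
                  [ 0.0,  1.0]])                #  g2 <= cap2
    b = np.array([-150.0, 100.0, 120.0])

    # Owner side: mask with an affine change of variables x = M y + r and a random
    # positive diagonal scaling D of the constraints (keeps inequality directions).
    n = c.size
    M = rng.uniform(0.5, 2.0, (n, n)) + n * np.eye(n)   # well-conditioned, invertible
    r = rng.uniform(0.0, 10.0, n)
    D = np.diag(rng.uniform(0.5, 2.0, A.shape[0] + n))

    A_full = np.vstack([A, -np.eye(n)])          # fold x >= 0 into the inequality block
    b_full = np.concatenate([b, np.zeros(n)])
    A_masked = D @ A_full @ M
    b_masked = D @ (b_full - A_full @ r)
    c_masked = M.T @ c

    # Cloud side: solves only the masked LP, never seeing A, b, c or x directly.
    res = linprog(c_masked, A_ub=A_masked, b_ub=b_masked, bounds=(None, None))

    # Owner side: recover the true dispatch from the masked solution.
    x = M @ res.x + r
    print("dispatch (MW):", np.round(x, 3), " cost ($/h):", round(float(c @ x), 2))

The owner's work is limited to a few small matrix products for masking and unmasking, which is what makes outsourcing the (much larger) solve attractive.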
Survey of computer programs for prediction of crash response and of its experimental validation
NASA Technical Reports Server (NTRS)
Kamat, M. P.
1976-01-01
The author seeks to critically assess the potentialities of the mathematical and hybrid simulators which predict post-impact response of transportation vehicles. A strict rigorous numerical analysis of a complex phenomenon like crash may leave a lot to be desired with regard to the fidelity of mathematical simulation. Hybrid simulations on the other hand which exploit experimentally observed features of deformations appear to hold a lot of promise. MARC, ANSYS, NONSAP, DYCAST, ACTION, WHAM II and KRASH are among some of the simulators examined for their capabilities with regard to prediction of post impact response of vehicles. A review of these simulators reveals that much more by way of an analysis capability may be desirable than what is currently available. NASA's crashworthiness testing program in conjunction with similar programs of various other agencies, besides generating a large data base, will be equally useful in the validation of new mathematical concepts of nonlinear analysis and in the successful extension of other techniques in crashworthiness.
Green, Dale E; Hamory, Bruce H; Terrell, Grace E; O'Connell, Jasmine
2017-08-01
Over the course of a single year, Cornerstone Health Care, a multispecialty group practice in North Carolina, redesigned the underlying care models for 5 of its highest-risk populations: late-stage congestive heart failure, oncology, Medicare-Medicaid dual eligibles, those with 5 or more chronic conditions, and the most complex patients with multiple late-stage chronic conditions. At the 1-year mark, the results of the program were analyzed. Overall costs for the patients studied were reduced by 12.7% compared to the year before enrollment. All fully implemented programs delivered between 10% and 16% cost savings. The key savings factor was hospitalization, which was reduced by 30% across all programs. The greatest area of cost increase was "other," a category that consisted in large part of hospice services. Full implementation was key; 2 primary care sites that reverted to more traditional models failed to show the same pattern of savings.
An alternative splicing program promotes adipose tissue thermogenesis
Vernia, Santiago; Edwards, Yvonne JK; Han, Myoung Sook; Cavanagh-Kyros, Julie; Barrett, Tamera; Kim, Jason K; Davis, Roger J
2016-01-01
Alternative pre-mRNA splicing expands the complexity of the transcriptome and controls isoform-specific gene expression. Whether alternative splicing contributes to metabolic regulation is largely unknown. Here we investigated the contribution of alternative splicing to the development of diet-induced obesity. We found that obesity-induced changes in adipocyte gene expression include alternative pre-mRNA splicing. Bioinformatics analysis associated part of this alternative splicing program with sequence specific NOVA splicing factors. This conclusion was confirmed by studies of mice with NOVA deficiency in adipocytes. Phenotypic analysis of the NOVA-deficient mice demonstrated increased adipose tissue thermogenesis and improved glycemia. We show that NOVA proteins mediate a splicing program that suppresses adipose tissue thermogenesis. Together, these data provide quantitative analysis of gene expression at exon-level resolution in obesity and identify a novel mechanism that contributes to the regulation of adipose tissue function and the maintenance of normal glycemia. DOI: http://dx.doi.org/10.7554/eLife.17672.001 PMID:27635635
NASA Technical Reports Server (NTRS)
Bolin, B.
1984-01-01
The global biosphere is an exceedingly complex system. To gain an understanding of its structure and dynamic features, it is necessary not only to increase knowledge about the detailed processes, but also to develop models of how global interactions take place. Attempts to analyze the detailed physical, chemical and biological processes need, in this context, to be guided by an advancing understanding of the latter. It is necessary to develop a strategy of data gathering that serves both these purposes simultaneously. Climate research during the last decade may serve as a useful example of how to approach this difficult problem in a systematic way. Large programs for data collection may easily become rigid and costly. While realizing the necessity of a systematic and long lasting effort of observing the atmosphere, the oceans, land and life on Earth, such a program must remain flexible enough to permit the modifications and even sometimes improvisations that are necessary to maintain a viable program.
Technology Development Activities for the Space Environment and its Effects on Spacecraft
NASA Technical Reports Server (NTRS)
Kauffman, Billy; Hardage, Donna; Minor, Jody; Barth, Janet; LaBel, Ken
2003-01-01
Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how emerging microelectronics will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the push to use Commercial-off-the-shelf (COTS) and shrinking microelectronics behind less shielding and the potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will describe the relationship between the Living With a Star (LWS): Space Environment Testbeds (SET) Project and NASA's Space Environments and Effects (SEE) Program and their technology development activities funded as a result from the recent SEE Program's NASA Research Announcement.
Storch, Tatiane Timm; Finatto, Taciane; Bruneau, Maryline; Orsel-Baldwin, Mathilde; Renou, Jean-Pierre; Rombaldi, Cesar Valmor; Quecini, Vera; Laurens, François; Girardi, César Luis
2017-09-06
Apple is commercially important worldwide. Favorable genomic contexts and postharvest technologies allow year-round availability. Although ripening is considered a unidirectional developmental process toward senescence, storage at low temperatures, alone or in combination with ethylene blockage, is effective in preserving apple properties. Quality traits and genome wide expression were integrated to investigate the mechanisms underlying postharvest changes. Development and conservation techniques were responsible for transcriptional reprogramming and distinct programs associated with quality traits. A large portion of the differentially regulated genes constitutes a program involved in ripening and senescence, whereas a smaller module consists of genes associated with reestablishment and maintenance of juvenile traits after harvest. Ethylene inhibition was associated with a reversal of ripening by transcriptional induction of anabolic pathways. Our results demonstrate that the blockage of ethylene perception and signaling leads to upregulation of genes in anabolic pathways. We also associated complex phenotypes to subsets of differentially regulated genes.
Liquid fluorine/hydrazine rhenium thruster update
NASA Technical Reports Server (NTRS)
Appel, M. A.; Kaplan, R. B.; Tuffias, R. H.
1983-01-01
The status of a fluorine/hydrazine thruster development program is discussed. A solid rhenium metal sea-level thrust chamber was successfully fabricated and tested for a total run duration of 1075 s with 17 starts. Rhenium fabrication methods are discussed. A test program was conducted to evaluate performance and chamber cooling. Acceptable performance was reached and cooling was adequate. A flight-type injector was fabricated that achieved an average extrapolated performance value of 3608 N-s/kg (368 lbf-s/lbm). Altitude thrust chambers were fabricated. One chamber incorporates a rhenium combustor and nozzle with an area ratio of 15:1, and a columbium nozzle extension with area ratios from 15:1 to 60:1. The other chamber was fabricated completely with a carbon/carbon composite. Because of the attributes of rhenium for use in high-temperature applications, a program to provide the materials and processes technology needed to reliably fabricate and/or repair vapor-deposited rhenium parts of relatively large size and complex shape is recommended.
[Documenting a rehabilitation program using a logic model: an advantage to the assessment process].
Poncet, Frédérique; Swaine, Bonnie; Pradat-Diehl, Pascale
2017-03-06
The cognitive and behavioral disorders after brain injury can result in severe limitations of activities and restrictions of participation. An interdisciplinary rehabilitation program was developed in physical medicine and rehabilitation at the Pitié-Salpêtrière Hospital, Paris, France. Clinicians believe this program decreases activity limitations and improves participation in patients. However, the program's effectiveness had never been assessed. To do this, we first had to define and describe the program. Rehabilitation programs, however, are holistic and thus complex, making them difficult to describe. Therefore, to facilitate the evaluation of complex programs, including those for rehabilitation, we illustrate the use of a theoretical logic model, as proposed by Champagne, through the process of documenting a specific complex and interdisciplinary rehabilitation program. Through participatory/collaborative research, the rehabilitation program was analyzed using three "submodels" of the logic model of intervention: the causal model, the intervention model and the program theory model. This should facilitate the evaluation of programs, including those for rehabilitation.
NASA Astrophysics Data System (ADS)
Cho, Minjeong; Lee, Jungil; Choi, Haecheon
2012-11-01
The mean wall shear stress boundary condition was successfully applied to turbulent channel and boundary layer flows using large eddy simulation without resolving the near-wall region (see Lee, Cho & Choi in this book of abstracts). In the present study, we apply this boundary condition to more complex flows in which flow separation and redeveloping flow exist. As a test problem, we consider flow over a backward-facing step at Reh = 22860 based on the step height. Turbulent boundary layer flow at the inlet (Reθ = 1050) is obtained using the inflow generation technique of Lund et al. (1998), but with the wall shear stress boundary condition. First, we prescribe the mean wall shear stress distribution obtained from DNS (Kim, 2011, Ph.D. Thesis, Stanford U.) as the boundary condition of the present simulation. Here, we impose a no-slip boundary condition in the flow-reversal region. The present results are in good agreement with the flow statistics from DNS. Currently, a dynamic approach for obtaining the mean wall shear stress based on the log law is being applied to flows with separation, and its results will be shown in the presentation. Supported by the WCU and NRF programs.
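A log-law based wall model of the kind mentioned above typically inverts the relation U/u_tau = (1/kappa) ln(y u_tau / nu) + B for the friction velocity u_tau at a wall-adjacent sampling height y, then sets tau_w = rho u_tau^2. The Python sketch below uses a simple fixed-point iteration with generic constants; it illustrates the standard relation, not the authors' dynamic approach:

    import math

    def wall_shear_from_loglaw(U, y, nu, rho, kappa=0.41, B=5.2, iters=50):
        """Estimate wall shear stress from the log law U/u_tau = (1/kappa) ln(y u_tau/nu) + B.

        Solved by fixed-point iteration on u_tau; generic constants, illustrative only.
        """
        u_tau = max(1e-6, 0.05 * U)            # crude initial guess
        for _ in range(iters):
            u_tau = U / ((1.0 / kappa) * math.log(y * u_tau / nu) + B)
        return rho * u_tau ** 2                # tau_w = rho * u_tau^2

    # Example: air-like properties, U = 10 m/s sampled at y = 5 mm from the wall.
    tau_w = wall_shear_from_loglaw(U=10.0, y=0.005, nu=1.5e-5, rho=1.2)
    print(f"estimated wall shear stress: {tau_w:.3f} Pa")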
Clinothem Lobe Growth and Possible Ties to Downslope Processes in the Gulf of Papua
NASA Astrophysics Data System (ADS)
Wei, E. A. Y.; Driscoll, N. W.; Milliman, J. D.; Slingerland, R. L.
2014-12-01
The Gulf of Papua is fed by the large-floodplain Fly River and small mountainous rivers to the north, thus creating an ideal environment where end-member cases of river systems and their deltas (e.g. the large-floodplain Brazos River and the narrow-shelved Eel River) can be studied. Input from five rivers into the gulf has constructed a three-dimensional mid-shelf clinothem composed of three depositional lobes, with a central lobe downlapped by two younger lobes to the north and south. This geometry suggests that the three lobes are not syndepositional but rather that clinoform depocenters have shifted 60 km, thus bypassing adjacent accommodation. Newly examined CHIRP (Compressed High Intensity Radar Pulse) seismic lines and XRF analysis of piston cores from the 2004 NSF MARGINS program reveal distinct lobes offshore that exhibit increased complexity moving shoreward. Evidence of shoreward complexity and lobe interfingering cause us to question the originally proposed mechanism for depocenter shift involving circulation changes. An alternative hypothesis that stems from distinct lobe architecture farther offshore suggests that channelized downslope processes and nearshore storage may play important roles in lobe growth.
Mylona, Anastasia; Carr, Stephen; Aller, Pierre; Moraes, Isabel; Treisman, Richard; Evans, Gwyndaf; Foadi, James
2017-08-04
The present article describes how to use the computer program BLEND to help assemble complete datasets for the solution of macromolecular structures, starting from partial or complete datasets, derived from data collection from multiple crystals. The program is demonstrated on more than two hundred X-ray diffraction datasets obtained from 50 crystals of a complex formed between the SRF transcription factor, its cognate DNA, and a peptide from the SRF cofactor MRTF-A. This structure is currently in the process of being fully solved. While full details of the structure are not yet available, the repeated application of BLEND on data from this structure, as they have become available, has made it possible to produce electron density maps clear enough to visualise the potential location of MRTF sequences.
Integrated Data Modeling and Simulation on the Joint Polar Satellite System Program
NASA Technical Reports Server (NTRS)
Roberts, Christopher J.; Boyce, Leslye; Smith, Gary; Li, Angela; Barrett, Larry
2012-01-01
The Joint Polar Satellite System is a modern, large-scale, complex, multi-mission aerospace program, and presents a variety of design, testing and operational challenges due to: (1) System Scope: multi-mission coordination, role, responsibility and accountability challenges stemming from porous/ill-defined system and organizational boundaries (including foreign policy interactions); (2) Degree of Concurrency: design, implementation, integration, verification and operation occurring simultaneously, at multiple scales in the system hierarchy; (3) Multi-Decadal Lifecycle: technical obsolescence, reliability and sustainment concerns, including those related to the organizational and industrial base. Additionally, these systems tend to become embedded in the broader societal infrastructure, resulting in new system stakeholders with perhaps different preferences; (4) Barriers to Effective Communications: process and cultural issues that emerge due to geographic dispersion and as one spans boundaries including gov./contractor, NASA/Other USG, and international relationships.
NASA Astrophysics Data System (ADS)
Yang, Yuchen; Mabu, Shingo; Shimada, Kaoru; Hirasawa, Kotaro
Intertransaction association rules have been reported to be useful in many fields such as stock market prediction, but efficient methods for extracting them from large data sets are still scarce. Furthermore, how to use and measure these more complex rules must be considered carefully. In this paper, we propose a new intertransaction class association rule mining method based on Genetic Network Programming (GNP), which is able to overcome some shortcomings of Apriori-like intertransaction association methods. Moreover, a general classifier model for intertransaction rules is also introduced. In experiments on the real-world application of stock market prediction, the method demonstrates its efficiency, obtains good results, and can bring further benefit when combined with a suitable classifier that considers a larger interval span.
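An intertransaction rule relates items across a window of consecutive transactions, for example "if stock A rises today, stock B rises within the next two days". The Python sketch below counts support and confidence for one such rule over a toy transaction sequence; it illustrates only the rule semantics, not the GNP-based mining or the classifier:

    # Counting support/confidence of a simple intertransaction rule over a sliding window:
    # "antecedent at offset 0  =>  consequent within the next `span` transactions".
    def intertransaction_rule_stats(transactions, antecedent, consequent, span):
        antecedent_hits = rule_hits = 0
        for i, items in enumerate(transactions):
            if antecedent in items:
                antecedent_hits += 1
                window = transactions[i + 1 : i + 1 + span]
                if any(consequent in later for later in window):
                    rule_hits += 1
        support = rule_hits / len(transactions)
        confidence = rule_hits / antecedent_hits if antecedent_hits else 0.0
        return support, confidence

    # Toy "daily market movement" transactions.
    days = [{"A_up"}, {"B_up"}, {"A_up", "C_up"}, {"C_up"}, {"B_up"}, {"A_up"}]
    print(intertransaction_rule_stats(days, "A_up", "B_up", span=2))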
Simulating New Drop Test Vehicles and Test Techniques for the Orion CEV Parachute Assembly System
NASA Technical Reports Server (NTRS)
Morris, Aaron L.; Fraire, Usbaldo, Jr.; Bledsoe, Kristin J.; Ray, Eric; Moore, Jim W.; Olson, Leah M.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project is engaged in a multi-year design and test campaign to qualify a parachute recovery system for human use on the Orion Spacecraft. Test and simulation techniques have evolved concurrently to keep up with the demands of a challenging and complex system. The primary simulations used for preflight predictions and post-test data reconstructions are Decelerator System Simulation (DSS), Decelerator System Simulation Application (DSSA), and Drop Test Vehicle Simulation (DTV-SIM). The goal of this paper is to provide a roadmap to future programs on the test technique challenges and obstacles involved in executing a large-scale, multi-year parachute test program. A focus on flight simulation modeling and correlation to test techniques executed to obtain parachute performance parameters are presented.
NASA Technical Reports Server (NTRS)
Selkirk, Henry B.
1996-01-01
This report reviews the second year of a three-year research program to investigate the physical mechanisms which underlie the transport of trace constituents in the stratosphere- troposphere system. The primary scientific goal of the research is to identify the processes which transport air masses within the lower stratosphere, particularly between the tropics and middle latitudes. The SASS program seeks to understand the impact of the present and future fleets of conventional jet traffic on the upper troposphere and lower stratosphere, while complementary airborne observations under UARP seek to understand the complex interactions of dynamical and chemical processes that affect the ozone layer. The present investigation contributes to the goals of each of these by diagnosing the history of air parcels intercepted by NASA research aircraft in UARP and AEAP campaigns.
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
The mathematical statement for the solving of the problem of N-version software system design
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Kovalev, D. I.; Zelenkov, P. V.; Voroshilova, A. A.
2015-10-01
N-version programming, as a methodology for designing fault-tolerant software systems, allows such design tasks to be solved successfully. The N-version programming approach is effective because the system is constructed out of several parallel executed versions of some software module. Those versions are written to meet the same specification but by different programmers. Developing an optimal structure for an N-version software system is a very complex optimization problem, which makes deterministic optimization methods unsuitable; exploiting heuristic strategies is therefore more rational. In the field of pseudo-Boolean optimization theory, the so-called method of varied probabilities (MVP) has been developed to solve problems of large dimensionality.
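The N-version idea itself, executing several independently written versions of a module against the same specification and adjudicating their outputs by majority vote, can be sketched in a few lines of Python; the three toy versions below are illustrative, and the structural-optimization problem the abstract formulates is not shown:

    import math
    from collections import Counter

    # Three independently written "versions" of the same module (illustrative only):
    # each must meet the same specification, here the integer square root.
    def v1(n): return int(n ** 0.5)
    def v2(n):
        x = 0
        while (x + 1) * (x + 1) <= n:
            x += 1
        return x
    def v3(n): return math.isqrt(n)

    def n_version_execute(versions, x):
        """Run all versions and return the majority-voted result (fault tolerance by redundancy)."""
        results = [v(x) for v in versions]
        value, votes = Counter(results).most_common(1)[0]
        if votes <= len(versions) // 2:
            raise RuntimeError("no majority -- versions disagree")
        return value

    print(n_version_execute([v1, v2, v3], 10**8))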
Mylona, Anastasia; Carr, Stephen; Aller, Pierre; Moraes, Isabel; Treisman, Richard; Evans, Gwyndaf; Foadi, James
2018-01-01
The present article describes how to use the computer program BLEND to help assemble complete datasets for the solution of macromolecular structures, starting from partial or complete datasets, derived from data collection from multiple crystals. The program is demonstrated on more than two hundred X-ray diffraction datasets obtained from 50 crystals of a complex formed between the SRF transcription factor, its cognate DNA, and a peptide from the SRF cofactor MRTF-A. This structure is currently in the process of being fully solved. While full details of the structure are not yet available, the repeated application of BLEND on data from this structure, as they have become available, has made it possible to produce electron density maps clear enough to visualise the potential location of MRTF sequences. PMID:29456874
ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization
NASA Astrophysics Data System (ADS)
Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.
2009-12-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks (e.g. data mining in HEP) by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.
Program summary
Program title: ROOT
Catalogue identifier: AEFA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: LGPL
No. of lines in distributed program, including test data, etc.: 3 044 581
No. of bytes in distributed program, including test data, etc.: 36 325 133
Distribution format: tar.gz
Programming language: C++
Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC
Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX
Has the code been vectorized or parallelized?: Yes
RAM: > 55 Mbytes
Classification: 4, 9, 11.9, 14
Nature of problem: Storage, analysis and visualization of scientific data
Solution method: Object store, wide range of analysis algorithms and visualization methods
Additional comments: For an up-to-date author list see: http://root.cern.ch/drupal/content/root-development-team and http://root.cern.ch/drupal/content/former-root-developers
Running time: Depending on the data size and complexity of analysis algorithms
References: http://root.cern.ch
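ROOT also ships Python bindings (PyROOT) that expose the same classes. A minimal sketch of the workflow described above, filling a histogram, fitting it and persisting it in a ROOT file, is shown below; it assumes a local ROOT installation and uses only standard class names:

    import ROOT  # PyROOT bindings shipped with ROOT; assumes ROOT is installed

    # Create an output file and a one-dimensional histogram.
    out = ROOT.TFile("example.root", "RECREATE")
    h = ROOT.TH1F("h_gaus", "Gaussian sample;x;entries", 100, -5.0, 5.0)

    # Fill with 10k values drawn from a unit Gaussian and fit it.
    h.FillRandom("gaus", 10000)
    h.Fit("gaus", "Q")          # "Q" = quiet fit

    # Persist the histogram (with its fit function) in the machine-independent ROOT format.
    h.Write()
    out.Close()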
NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations
NASA Astrophysics Data System (ADS)
Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.
2010-09-01
The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance.
Program summary
Program title: NWChem
Catalogue identifier: AEGI_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Open Source Educational Community License
No. of lines in distributed program, including test data, etc.: 11 709 543
No. of bytes in distributed program, including test data, etc.: 680 696 106
Distribution format: tar.gz
Programming language: Fortran 77, C
Computer: all Linux based workstations and parallel supercomputers, Windows and Apple machines
Operating system: Linux, OS X, Windows
Has the code been vectorised or parallelized?: Code is parallelized
Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13
Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of many-electron Hamiltonian, analysis of the potential energy surface, and dynamics.
Solution method: Ground and excited solutions of many-electron Hamiltonian are obtained utilizing density-functional theory, many-body perturbation approach, and coupled cluster expansion. These solutions or a combination thereof with classical descriptions are then used to analyze potential energy surface and perform dynamical simulations.
Additional comments: Full documentation is provided in the distribution file. This includes an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent.
Running time: Running time depends on the size of the chemical system, complexity of the method, number of cpu's and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab-initio molecular dynamics simulation on hundreds of atoms.
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
Developing a Physician Management & Leadership Program (PMLP) in Newfoundland and Labrador.
Maddalena, Victor; Fleet, Lisa
2015-01-01
This article aims to document the process the province of Newfoundland and Labrador used to develop an innovative Physician Management and Leadership Program (PMLP). The PMLP is a collaborative initiative among Memorial University (Faculty of Medicine and Faculty of Business), the Government of Newfoundland and Labrador, and the Regional Health Authorities. As challenges facing health-care systems become more complex, there is a growing need for management and leadership training for physicians. Memorial University Faculty of Medicine and the Gardiner Centre in the Faculty of Business, in partnership with the Regional Health Authorities and the Government of Newfoundland and Labrador, identified the need for a leadership and management education program for physician leaders. A provincial needs assessment of physician leaders was conducted to identify the educational needs that such a program should address. A Steering Committee was formed to guide the design and implementation and to monitor delivery of the 10-module PMLP. Designing management and leadership education programs to serve physicians who practice in a large, predominately rural geographic area can be challenging and requires efficient use of available resources and technology. While there are many physician management and leadership programs available in Canada and abroad, the PMLP was designed to meet the specific educational needs of physician leaders in Newfoundland and Labrador.
Current Status of Postdoctoral and Graduate Programs in Dentistry.
Assael, Leon
2017-08-01
Advanced dental education has evolved in the context of societal needs and economic trends to its current status. Graduate programs have positioned their role in the context of health systems and health science education trends in hospitals, interprofessional clinical care teams, and dental schools and oral health care systems. Graduate dental education has been a critical factor in developing teams in trauma care, craniofacial disorders, pediatric and adult medicine, and oncology. The misalignment of the mission of graduate dental programs and the demands of private practice has posed a challenge in the evolution of programs as educational programs have been directed towards tertiary and indigent care while the practice community focuses on largely healthy affluent patients for complex clinical interventions. Those seeking graduate dental education today are smaller in number and include more international dental graduates than in the past. Graduate dental education in general dentistry and in the nine recognized dental specialties now includes Commission on Dental Accreditation (CODA) recognition of training standards as part of its accreditation process and a CODA accreditation process for areas of clinical education not recognized as specialties by the American Dental Association. Current types of programs include fellowship training for students in recognized specialties. This article was written as part of the project "Advancing Dental Education in the 21st Century."
Improving vocational rehabilitation services for injured workers in Washington State.
Sears, Jeanne M; Wickizer, Thomas M; Schulman, Beryl A
2014-06-01
Workers who incur permanent impairments or have ongoing medical restrictions due to injuries or illnesses sustained at work may require support from vocational rehabilitation programs in order to return to work. Vocational rehabilitation programs implemented within workers' compensation settings are costly, and effective service delivery has proven challenging. The Vocational Improvement Project, a 5.5-year pilot program beginning in 2008, introduced major changes to the Washington State workers' compensation-based vocational rehabilitation program. In the evaluation of this pilot program, set within a large complex system characterized by competing stakeholder interests, we assessed effects on system efficiency and employment outcomes for injured workers. While descriptive in nature, this evaluation provided evidence that several of the intended outcomes were attained, including: (1) fewer repeat referrals, (2) fewer delays, (3) increased choice for workers, and (4) establishment of statewide partnerships to improve worker outcomes. There remains substantial room for further improvement. Retraining plan completion rates remain under 60% and only half of workers earned any wages within two years of completing their retraining plan. Ongoing communication with stakeholders was critical to the successful conduct and policy impact of this evaluation, which culminated in a 3-year extension of the pilot program through June 2016. Copyright © 2013 Elsevier Ltd. All rights reserved.
Computer-aided programming for message-passing system; Problems and a solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, M.Y.; Gajski, D.D.
1989-12-01
As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.
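As a concrete illustration of what such a CAP tool automates, the Python sketch below assigns the tasks of a small dependence graph to two processors and inserts send/receive primitives wherever a dependence crosses the processor boundary. The round-robin placement and the primitive names are illustrative assumptions, not the tool described in the report:

    # Toy illustration of CAP-style code generation: place tasks of a dependence graph
    # on processors, then insert send/recv primitives for cross-processor edges.
    tasks = ["load", "fft", "filter", "ifft", "store"]
    deps = [("load", "fft"), ("fft", "filter"), ("filter", "ifft"), ("ifft", "store")]

    # Trivial round-robin placement onto two processors (a real tool would apply
    # a scheduling heuristic such as critical-path list scheduling).
    placement = {t: i % 2 for i, t in enumerate(tasks)}

    program = {0: [], 1: []}
    for src, dst in deps:
        p_src, p_dst = placement[src], placement[dst]
        program[p_src].append(f"run {src}")
        if p_src != p_dst:                      # dependence crosses processors:
            program[p_src].append(f"send({src} -> P{p_dst})")
            program[p_dst].append(f"recv({src} <- P{p_src})")
    program[placement[tasks[-1]]].append(f"run {tasks[-1]}")

    for proc, code in program.items():
        print(f"P{proc}:", "; ".join(code))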
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre
2015-04-01
Due to its large heterogeneity at all scales (from the soil core to the globe), several measurements are often mandatory to obtain a meaningful value of a measured soil property. A large number of measurements can therefore be needed to study a soil property, whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets, such as pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), which produces complex 3-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, the treatment of the results of a Py-GC-MS analysis of a soil sample is time consuming (number of peaks, co-elution, etc.), and the treatment of large data sets of Py-GC-MS results is rather laborious. Moreover, peak position shifts and baseline drifts between analyses make the automated treatment of GC-MS data difficult. These problems can be addressed using Parallel Factor Analysis 2 (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has been applied frequently to chromatography data but has never been applied to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyro-chromatogram mass spectra. We applied this routine to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References: Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.
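Although the authors work in Matlab, the same decomposition is available in Python; a minimal sketch of calling PARAFAC2 on a stack of chromatogram matrices is given below. It assumes the tensorly library's parafac2 implementation and uses random matrices in place of real Py-GC-MS data, so it only shows the shape of the workflow, not the authors' routine:

    import numpy as np
    from tensorly.decomposition import parafac2  # assumes tensorly is installed

    rng = np.random.default_rng(1)

    # Stand-in for 40 pyro-chromatograms: each sample is a (retention time x m/z) matrix.
    # PARAFAC2 lets the elution-mode factors differ between samples, which is why it
    # tolerates the retention-time shifts mentioned above.
    n_samples, n_times, n_mz, rank = 40, 300, 120, 4
    slices = [rng.random((n_times, n_mz)) for _ in range(n_samples)]

    # One call decomposes all chromatograms simultaneously; the result bundles the
    # shared spectral factors, per-sample elution profiles and sample loadings.
    decomposition = parafac2(slices, rank)
    print(type(decomposition))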
A multidisciplinary Earth science research program in China
NASA Astrophysics Data System (ADS)
Dong, Shuwen; Li, Tingdong; Gao, Rui; Hou, Hesheng; Li, Yingkang; Zhang, Shihong; Keller, G. Randy; Liu, Mian
2011-09-01
Because China occupies a large and geologically complex region of central and eastern Asia, the country may hold the keys to resolving many basic problems in the Earth sciences, such as how continental collision with India produced China's interconnected array of large intraplate structures, and what links exist between these structures and natural resources. To learn more, the Chinese government has launched SinoProbe, a major research initiative focusing on multidisciplinary imaging of the three-dimensional (3-D) structure and composition of the Chinese continental lithosphere and its evolution through geologic history. This effort is also motivated by China's need for a comprehensive and systematic evaluation of its natural resources and a better understanding of potential geohazards. SinoProbe is funded by the Chinese Ministry of Finance, managed by the Chinese Ministry of Land and Resources, and organized by the Chinese Academy of Geological Sciences. More than 960 investigators and engineers are currently involved with the program, not counting international collaborators. Most of them are affiliated with the Chinese Academy of Geological Sciences, the Chinese Academy of Sciences, the Ministry of Education (i.e., universities), and the China Earthquake Administration. The initial phase of the program (2008-2012), with funding equivalent to about US$164 million, is testing the feasibility of new technologies in geophysical and geochemical exploration and deep continental drilling by focusing on a series of profiles (Figure 1).
Verification and Validation for Flight-Critical Systems (VVFCS)
NASA Technical Reports Server (NTRS)
Graves, Sharon S.; Jacobsen, Robert A.
2010-01-01
On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).
Evaluating Innovations in Home Care for Performance Accountability.
Collister, Barbara; Gutscher, Abram; Ambrogiano, Jana
2016-01-01
Concerns about rising costs and the sustainability of our healthcare system have led to a drive for innovative solutions and accountability for performance. Integrated Home Care, Calgary Zone, Alberta Health Services went beyond traditional accountability measures to use evaluation methodology to measure the progress of complex innovations to its organization structure and service delivery model. This paper focuses on the first two phases of a three-phase evaluation. The results of the first two phases generated learning about innovation adoption and sustainability, and performance accountability at the program-level of a large publicly funded healthcare organization.
Pre-sporulation stages of Streptomyces differentiation: state-of-the-art and future perspectives.
Yagüe, Paula; López-García, Maria T; Rioseras, Beatriz; Sánchez, Jesús; Manteca, Angel
2013-05-01
Streptomycetes comprise very important industrial bacteria, producing two-thirds of all clinically relevant secondary metabolites. They are mycelial microorganisms with complex developmental cycles that include programmed cell death (PCD) and sporulation. Industrial fermentations are usually performed in liquid cultures (large bioreactors), conditions in which Streptomyces strains generally do not sporulate, and it was traditionally assumed that there was no differentiation. In this work, we review the current knowledge on Streptomyces pre-sporulation stages of Streptomyces differentiation. © 2013 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
VISIONS - Vista Star Formation Atlas
NASA Astrophysics Data System (ADS)
Meingast, Stefan; Alves, J.; Boui, H.; Ascenso, J.
2017-06-01
In this talk I will present the new ESO public survey VISIONS. Starting in early 2017 we will use the ESO VISTA survey telescope in a 550 h long programme to map the largest molecular cloud complexes within 500 pc in a multi-epoch program. The survey is optimized for measuring the proper motions of young stellar objects invisible to Gaia and mapping the cloud-structure with extinction. VISIONS will address a series of ISM topics ranging from the connection of dense cores to YSOs and the dynamical evolution of embedded clusters to variations in the reddening law on both small and large scales.
1981-05-01
be allocated to targets on the battlefield and in the rear area. The speaker describes the VECTOR I/NUCLEAR model, a combination of the UNICORN target...outlined. UNICORN is compatible with VECTOR 1 in level of detail. It is an expected value damage model and uses linear programming to optimize the...and a growing appreciation for the power of simulation in addressing large, complex problems, it was only a few short years before these games had
African Swine Fever Virus Gets Undressed: New Insights on the Entry Pathway.
Andrés, Germán
2017-02-15
African swine fever virus (ASFV) is a large, multienveloped DNA virus composed of a genome-containing core successively wrapped by an inner lipid envelope, an icosahedral protein capsid, and an outer lipid envelope. In keeping with this structural complexity, recent studies have revealed an intricate entry program. This Gem highlights how ASFV uses two alternative pathways, macropinocytosis and clathrin-mediated endocytosis, to enter into the host macrophage and how the endocytosed particles undergo a stepwise, low pH-driven disassembly leading to inner envelope fusion and core delivery in the cytoplasm. Copyright © 2017 American Society for Microbiology.
U.S. Nuclear Weapons Modernization - the Stockpile Life Extension Program
NASA Astrophysics Data System (ADS)
Cook, Donald
2016-03-01
Underground nuclear testing of U.S. nuclear weapons was halted by President George H.W. Bush in 1992 when he announced a moratorium. In 1993, the moratorium was extended by President Bill Clinton and, in 1995, a program of Stockpile Stewardship was put in its place. In 1996, President Clinton signed the Comprehensive Nuclear Test Ban Treaty (CTBT). Twenty years have passed since then. Over the same time, the average age of a nuclear weapon in the stockpile has increased from 6 years (1992) to nearly 29 years (2015). At its inception, achievement of the objectives of the Stockpile Stewardship Program (SSP) appeared possible but very difficult. The cost to design and construct several large facilities for precision experimentation in hydrodynamics and high energy density physics was large. The practical steps needed to move from computational platforms of less than 100 Mflops/sec to 10 Teraflops/sec and beyond were unknown. Today, most of the required facilities for SSP are in place and computational speed has been increased by more than six orders of magnitude. These, and the physicists and engineers in the complex of labs and plants within the National Nuclear Security Administration (NNSA) who put them in place, have been the basis for underpinning an annual decision, made by the weapons lab directors for each of the past 20 years, that resort to underground nuclear testing is not needed for maintaining confidence in the safety and reliability of the U.S stockpile. A key part of that decision has been annual assessment of the physical changes in stockpiled weapons. These weapons, quite simply, are systems that invariably and unstoppably age in the internal weapon environment of radioactive materials and complex interfaces of highly dissimilar organic and inorganic materials. Without an ongoing program to rebuild some components and replace other components to increase safety or security, i.e., life extending these weapons, either underground testing would again be required to assess many changes at once, or confidence in these weapons would be reduced. The strategy and details of the U.S. Stockpile Life Extension Program will be described in this talk. In brief, the strategy is to reduce the number of weapons in the stockpile while increasing confidence in the weapons that remain and, where possible, increase their safety, increase their security, and reduce their nuclear material quantities and yields. A number of ``myths'' pertaining to nuclear weapons, the SSP, and the Stockpile Life Extension Program will be explored.
Parallel Optimization of Polynomials for Large-scale Problems in Stability and Control
NASA Astrophysics Data System (ADS)
Kamyar, Reza
In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to the converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid the problem of intractability, we establish a trade-off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems --- in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs) --- whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grow exponentially with the progress of the sequence, meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. The existing optimization algorithms and software are only designed to use desktop computers or small cluster computers --- machines which do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This in fact is the reason we seek parallel algorithms for setting up and solving large SDPs on large cluster- and/or super-computers. We propose parallel algorithms for stability analysis of two classes of systems: 1) Linear systems with a large number of uncertain parameters; 2) Nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to efficiently utilize hundreds and potentially thousands of processors, and analyze systems with 100+ dimensional state-space. Furthermore, we extend our algorithms to analyze robust stability over more complicated geometries such as hypercubes and arbitrary convex polytopes. Our algorithms can be readily extended to address a wide variety of problems in control such as H-infinity synthesis for systems with parametric uncertainty and computing control Lyapunov functions.
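To make the flavor of these SDP relaxations concrete, the fragment below is a minimal sketch, not the thesis code: it searches for a single quadratic Lyapunov function shared by the two vertices of a toy uncertain system, which is the simplest (degree-zero) relative of the parameter-dependent conditions described above. The matrices, the tolerance, and the choice of the cvxpy/SCS toolchain are illustrative assumptions.

import cvxpy as cp
import numpy as np

# two vertices of a hypothetical polytopic system dx/dt = A(theta) x
A1 = np.array([[0.0, 1.0], [-2.0, -1.0]])
A2 = np.array([[0.0, 1.0], [-3.0, -0.5]])

n = A1.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n)]                          # P positive definite
for A in (A1, A2):
    constraints.append(A.T @ P + P @ A << -eps * np.eye(n))   # decrease at each vertex
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)   # "optimal" certifies robust stability for this toy polytope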
The effect of a complex training program on skating abilities in ice hockey players.
Lee, Changyoung; Lee, Sookyung; Yoo, Jaehyun
2014-04-01
[Purpose] Little data exist on systematic training programs to improve skating abilities in ice hockey players. The purpose of this study was to evaluate the effectiveness of a complex training program on skating abilities in ice hockey players. [Methods] Ten male ice hockey players (training group) that engaged in 12 weeks of complex training and skating training and ten male players (control group) that only participated in 12 weeks of skating training completed on-ice skating tests, including a 5-time 18-meter shuttle, t-test, Rink dash 5 times, and line drill, before, during, and after the training. [Results] Significant group-by-time interactions were found in all skating ability tests. [Conclusion] The complex training program intervention for 12 weeks improved the skating abilities of the ice hockey players.
NASA Astrophysics Data System (ADS)
Pavao-Zuckerman, M.; Huxman, T.; Morehouse, B.
2008-12-01
Earth system and ecological sustainability problems are complex outcomes of biological, physical, social, and economic interactions. A common goal of outreach and education programs is to foster a scientifically literate community that possesses the knowledge to contribute to environmental policies and decision making. Uncertainty and variability that is both inherent in Earth system and ecological sciences can confound such goals of improved ecological literacy. Public programs provide an opportunity to engage lay-persons in the scientific method, allowing them to experience science in action and confront these uncertainties face-on. We begin with a definition of scientific literacy that expands its conceptualization of science beyond just a collection of facts and concepts to one that views science as a process to aid understanding of natural phenomena. A process-based scientific literacy allows the public, teachers, and students to assimilate new information, evaluate climate research, and to ultimately make decisions that are informed by science. The Biosphere 2 facility (B2) is uniquely suited for such outreach programs because it allows linking Earth system and ecological science research activities in a large scale controlled environment setting with outreach and education opportunities. A primary outreach goal is to demonstrate science in action to an audience that ranges from K-12 groups to retired citizens. Here we discuss approaches to outreach programs that focus on soil-water-atmosphere-plant interactions and their roles in the impacts and causes of global environmental change. We describe a suite of programs designed to vary the amount of participation a visitor has with the science process (from passive learning to data collection to helping design experiments) to test the hypothesis that active learning fosters increased scientific literacy and the creation of science advocates. We argue that a revised framing of the scientific method with a more open role for citizens in science will have greater success in fostering science literacy and produce a citizenry that is equipped to tackle complex environmental decision making.
The solution of the optimization problem of small energy complexes using linear programming methods
NASA Astrophysics Data System (ADS)
Ivanin, O. A.; Director, L. B.
2016-11-01
Linear programming methods were used to solve the optimization problem of schemes and operation modes of distributed generation energy complexes. Applicability conditions of the simplex method, applied to energy complexes that include renewable energy installations (solar, wind), diesel generators, and energy storage, are considered. An analysis of decomposition algorithms for various schemes of energy complexes was carried out. The results of optimization calculations for energy complexes operated autonomously and as part of a distribution grid are presented.
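As a hedged, single-period illustration of the kind of dispatch LP such a model contains (the numbers and the two-source layout below are invented, and the paper's formulation also covers scheme selection and storage dynamics):

from scipy.optimize import linprog

# decision variables: [diesel_kW, solar_kW]; solar can be curtailed
cost = [0.30, 0.0]            # $/kWh: diesel fuel cost vs. free solar (assumed)
load_kw = 40.0
A_eq = [[1.0, 1.0]]           # diesel + solar must meet the load exactly
b_eq = [load_kw]
bounds = [(0.0, 50.0),        # diesel generator rating (assumed)
          (0.0, 25.0)]        # solar power available this hour (assumed)

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x, res.fun)         # uses all 25 kW of solar plus 15 kW of diesel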
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2013-12-01
As geoscientists are confronted with increasingly massive datasets from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data, and modify the parameters to create custom views of the data to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component of comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed in light of these challenges.
Large-Scale Astrophysical Visualization on Smartphones
NASA Astrophysics Data System (ADS)
Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.
2011-07-01
Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.
Python for large-scale electrophysiology.
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation ("dimstim"); one for electrophysiological waveform visualization and spike sorting ("spyke"); and one for spike train and stimulus analysis ("neuropy"). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience.
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.
2016-03-18
SPONSORED REPORT SERIES Understanding Complexity and Self-Organization in a Defense Program Management Organization (Experimental Design...experiment will examine the decision-making process within the program office and the self-organization of key program office personnel based upon formal...and informal communications links. Additionally, we are interested in the effects of this self-organizing process on the organization's shared
NASA Technical Reports Server (NTRS)
Craidon, C. B.
1983-01-01
A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept, so that highly accurate wave drag analysis can now be performed because complex geometries can be represented accurately and do not have to be modified to meet the requirements of a restricted input format.
A depth-first search algorithm to compute elementary flux modes by linear programming
2014-01-01
Background The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Results Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. Conclusions The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints. PMID:25074068
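The following is a hedged sketch of the LP feasibility test that lets such a depth-first search prune branches early; the toy stoichiometric matrix, the anchoring convention, and the scipy-based formulation are illustrative assumptions rather than the authors' implementation.

import numpy as np
from scipy.optimize import linprog

# toy network: 4 irreversible reactions, 2 internal metabolites (hypothetical)
S = np.array([[ 1, -1,  0,  0],
              [ 0,  1, -1, -1]])

def feasible(excluded, anchor, S=S):
    """Is there a steady-state flux with v[anchor] = 1 and v[i] = 0 for i in excluded?"""
    n = S.shape[1]
    bounds = [(0.0, None)] * n           # all reactions irreversible in this toy case
    for i in excluded:
        bounds[i] = (0.0, 0.0)
    bounds[anchor] = (1.0, 1.0)          # normalization keeps the flux nonzero
    res = linprog(np.zeros(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return res.status == 0               # 0 = optimum found, i.e. the branch is feasible

print(feasible(excluded=[3], anchor=0))   # True: the route through reactions 0, 1, 2 works
print(feasible(excluded=[1], anchor=0))   # False: reaction 1 is essential for reaction 0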
HBLAST: Parallelised sequence similarity--A Hadoop MapReducable basic local alignment search tool.
O'Driscoll, Aisling; Belogrudov, Vladislav; Carroll, John; Kropp, Kai; Walsh, Paul; Ghazal, Peter; Sleator, Roy D
2015-04-01
The recent exponential growth of genomic databases has resulted in the common task of sequence alignment becoming one of the major bottlenecks in the field of computational biology. It is typical for these large datasets and complex computations to require cost prohibitive High Performance Computing (HPC) to function. As such, parallelised solutions have been proposed but many exhibit scalability limitations and are incapable of effectively processing "Big Data" - the name attributed to datasets that are extremely large, complex and require rapid processing. The Hadoop framework, comprised of distributed storage and a parallelised programming framework known as MapReduce, is specifically designed to work with such datasets but it is not trivial to efficiently redesign and implement bioinformatics algorithms according to this paradigm. The parallelisation strategy of "divide and conquer" for alignment algorithms can be applied to both data sets and input query sequences. However, scalability is still an issue due to memory constraints or large databases, with very large database segmentation leading to additional performance decline. Herein, we present Hadoop Blast (HBlast), a parallelised BLAST algorithm that proposes a flexible method to partition both databases and input query sequences using "virtual partitioning". HBlast presents improved scalability over existing solutions and well balanced computational work load while keeping database segmentation and recompilation to a minimum. Enhanced BLAST search performance on cheap memory constrained hardware has significant implications for in field clinical diagnostic testing; enabling faster and more accurate identification of pathogenic DNA in human blood or tissue samples. Copyright © 2015 Elsevier Inc. All rights reserved.
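To illustrate the bookkeeping behind "virtual partitioning" (the names and sizes below are hypothetical, and the real HBlast work is done inside Hadoop map tasks rather than plain Python):

from itertools import product

def virtual_partitions(db_size_mb, db_window_mb, n_queries, queries_per_task):
    # windows into the database are described by (offset, length) rather than
    # by physically re-splitting and recompiling the database
    db_windows = [(start, min(db_window_mb, db_size_mb - start))
                  for start in range(0, db_size_mb, db_window_mb)]
    query_slices = [(q, min(queries_per_task, n_queries - q))
                    for q in range(0, n_queries, queries_per_task)]
    # every map task aligns one query slice against one database window
    return list(product(query_slices, db_windows))

tasks = virtual_partitions(db_size_mb=4000, db_window_mb=500,
                           n_queries=10_000, queries_per_task=2_500)
print(len(tasks), "map tasks, e.g.", tasks[0])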
ERIC Educational Resources Information Center
Riedler, Martina; Eryaman, Mustafa Yunus
2016-01-01
There is consensus in the literature that teacher education programs exhibit the characteristics of complex systems. These characteristics of teacher education programs as complex systems challenges the conventional, teacher-directed/ textbook-based positivist approaches in teacher education literature which has tried to reduce the complexities…
Matrix management in hospitals: testing theories of matrix structure and development.
Burns, L R
1989-09-01
A study of 315 hospitals with matrix management programs was used to test several hypotheses concerning matrix management advanced by earlier theorists. The study verifies that matrix management involves several distinctive elements that can be scaled to form increasingly complex types of lateral coordinative devices. The scalability of these elements is evident only cross-sectionally. The results show that matrix complexity is not an outcome of program age, nor does matrix complexity at the time of implementation appear to influence program survival. Matrix complexity, finally, is not determined by the organization's task diversity and uncertainty. The results suggest several modifications in prevailing theories of matrix organization.
Interactive Visualization of Complex Seismic Data and Models Using Bokeh
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, Chengping; Ammon, Charles J.; Maceira, Monica
Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach has a minimum requirement on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide us an effective and efficient approach to explore large data sets and models.
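As a flavor of how compact an interactive Bokeh figure can be, here is a minimal sketch plotting a made-up dispersion curve; the data values and tool list are illustrative and not taken from the paper.

from bokeh.plotting import figure, show
from bokeh.io import output_file

periods = [10, 20, 30, 40, 60, 80, 100]                 # s (hypothetical)
group_velocity = [2.9, 3.1, 3.4, 3.6, 3.8, 3.9, 4.0]    # km/s (hypothetical)

output_file("dispersion.html")
p = figure(title="Surface-wave group velocity (toy data)",
           x_axis_label="Period (s)", y_axis_label="Velocity (km/s)",
           tools="pan,wheel_zoom,box_zoom,reset,hover")
p.line(periods, group_velocity, line_width=2)
show(p)   # writes the standalone HTML file and opens it in a web browser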
Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd
2005-01-01
Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.
Macfarlane, Fraser; Greenhalgh, Trish; Humphrey, Charlotte; Hughes, Jane; Butler, Ceri; Pawson, Ray
2011-01-01
This paper seeks to describe the exploration of human resource issues in one large-scale program of innovation in healthcare. It is informed by established theories of management in the workplace and a multi-level model of diffusion of innovations. A realist approach was used based on interviews, ethnographic observation and documentary analysis. Five main approaches ("theories of change") were adopted to develop and support the workforce: recruiting staff with skills in service transformation; redesigning roles and creating new roles; enhancing workforce planning; linking staff development to service needs; creating opportunities for shared learning and knowledge exchange. Each had differing levels of success. The paper includes HR implications for the modernisation of a complex service organisation. This is the first time a realist evaluation of a complex health modernisation initiative has been undertaken.
Interactive Visualization of Complex Seismic Data and Models Using Bokeh
Chai, Chengping; Ammon, Charles J.; Maceira, Monica; ...
2018-02-14
Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach has a minimum requirement on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide us an effective and efficient approach to explore large data sets and models.
MSFC Optical Metrology: A National Resource
NASA Technical Reports Server (NTRS)
Burdine, Robert
1998-01-01
A national need exists for Large Diameter Optical Metrology Services. These services include the manufacture, testing, and assurance of precision and control necessary to assure the success of large optical projects. "Best Practices" are often relied on for manufacture and quality controls, while optical projects are increasingly more demanding and complex. Marshall Space Flight Center (MSFC) has acquired unique optical measurement, testing and metrology capabilities through active participation in a wide variety of NASA optical programs. An overview of existing optical facilities and metrology capabilities is given with emphasis on use by other optical projects. Cost avoidance and project success are stressed through use of existing MSFC facilities and capabilities for measurement and metrology controls. Current issues in large diameter optical metrology are briefly reviewed. The need for a consistent and long-duration Large Diameter Optical Metrology Service Group is presented with emphasis on the establishment of a National Large Diameter Optical Standards Laboratory. Proposals are made to develop MSFC optical standards and metrology capabilities as the primary national standards resource, providing access to MSFC Optical Core Competencies for manufacturers and researchers. Plans are presented for the development of a national lending library of precision optical standards with emphasis on cost avoidance while improving measurement assurance.
NASA Technical Reports Server (NTRS)
Blair, M. F.
1991-01-01
A combined experimental and computational program was conducted to examine the heat transfer distribution in a turbine rotor passage geometrically similar to the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP). Heat transfer was measured and computed for both the full span suction and pressure surfaces of the rotor airfoil as well as for the hub endwall surface. The objective of the program was to provide a benchmark-quality database for the assessment of rotor heat transfer computational techniques. The experimental portion of the study was conducted in a large-scale, ambient temperature, rotating turbine model. The computational portion consisted of the application of a well-posed parabolized Navier-Stokes analysis to the calculation of the three-dimensional viscous flow through ducts simulating a gas turbine passage. The results of this assessment indicate that the procedure has the potential to predict the aerodynamics and the heat transfer in a gas turbine passage and can be used to develop detailed three-dimensional turbulence models for the prediction of skin friction and heat transfer in complex three-dimensional flow passages.
Gniadkowski, Marek; Pałucha, Andrzej; Grzesiowski, Paweł; Hryniewicz, Waleria
1998-01-01
In 1996 a large, 300-bed pediatric hospital in Warsaw, Poland, started a program of monitoring infections caused by extended-spectrum β-lactamase (ESBL)-producing microorganisms. Over the first 3-month period eight Klebsiella pneumoniae isolates were identified as being resistant to ceftazidime. Six of these were found to produce the TEM-47 ESBL, which we first described in a K. pneumoniae strain recovered a year before in a pediatric hospital in Łódź, Poland, which is 140 km from Warsaw. Typing results revealed a very close relatedness among all these isolates, which suggested that the clonal outbreak in Warsaw was caused by a strain possibly imported from Łódź. The remaining two isolates expressed the SHV-5-like ESBL, which resulted from the horizontal transfer of a plasmid carrying the blaSHV gene between nonrelated strains. The data presented here exemplify the complexity of the epidemiological situation concerning ESBL producers typical for large Polish hospitals, in which no ESBL-monitoring programs were in place prior to 1995. PMID:9835494
Automated Planning Enables Complex Protocols on Liquid-Handling Robots.
Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg
2018-03-16
Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.
Facebook Advertising Across an Engagement Spectrum: A Case Example for Public Health Communication.
Platt, Tevah; Platt, Jodyn; Thiel, Daniel B; Kardia, Sharon L R
2016-05-30
The interpersonal, dialogic features of social networking sites have untapped potential for public health communication. We ran a Facebook advertising campaign to raise statewide awareness of Michigan's newborn screening and biobanking programs. We ran a Facebook advertising campaign to stimulate public engagement on the complex and sensitive issue of Michigan's newborn screening and biobank programs. We ran an 11-week, US $15,000 Facebook advertising campaign engaging Michigan Facebook users aged 18-64 years about the state's newborn screening and population biobank programs, and we used a novel "engagement spectrum" framework to contextualize and evaluate engagement outcomes ranging from observation to multi-way conversation. The campaign reached 1.88 million Facebook users, yielding a range of engagement outcomes across ad sets that varied by objective, content, budget, duration, and bid type. Ad sets yielded 9009 page likes (US $4125), 15,958 website clicks (US $5578), and 12,909 complete video views to 100% (US $3750). "Boosted posts" yielded 528 comments and 35,966 page post engagements (US $1500). Overall, the campaign led to 452 shares and 642 comments, including 176 discussing newborn screening and biobanking. Facebook advertising campaigns can efficiently reach large populations and achieve a range of engagement outcomes by diversifying ad types, bid types, and content. This campaign provided a population-based approach to communication that also increased transparency on a sensitive and complex topic by creating a forum for multi-way interaction.
Facebook Advertising Across an Engagement Spectrum: A Case Example for Public Health Communication
Platt, Jodyn; Thiel, Daniel B; Kardia, Sharon L. R
2016-01-01
Background The interpersonal, dialogic features of social networking sites have untapped potential for public health communication. We ran a Facebook advertising campaign to raise statewide awareness of Michigan’s newborn screening and biobanking programs. Objective We ran a Facebook advertising campaign to stimulate public engagement on the complex and sensitive issue of Michigan’s newborn screening and biobank programs. Methods We ran an 11-week, US $15,000 Facebook advertising campaign engaging Michigan Facebook users aged 18-64 years about the state’s newborn screening and population biobank programs, and we used a novel “engagement spectrum” framework to contextualize and evaluate engagement outcomes ranging from observation to multi-way conversation. Results The campaign reached 1.88 million Facebook users, yielding a range of engagement outcomes across ad sets that varied by objective, content, budget, duration, and bid type. Ad sets yielded 9009 page likes (US $4125), 15,958 website clicks (US $5578), and 12,909 complete video views to 100% (US $3750). “Boosted posts” yielded 528 comments and 35,966 page post engagements (US $1500). Overall, the campaign led to 452 shares and 642 comments, including 176 discussing newborn screening and biobanking. Conclusions Facebook advertising campaigns can efficiently reach large populations and achieve a range of engagement outcomes by diversifying ad types, bid types, and content. This campaign provided a population-based approach to communication that also increased transparency on a sensitive and complex topic by creating a forum for multi-way interaction. PMID:27244774
Visualizing the Complex Process for Deep Learning with an Authentic Programming Project
ERIC Educational Resources Information Center
Peng, Jun; Wang, Minhong; Sampson, Demetrios
2017-01-01
Project-based learning (PjBL) has been increasingly used to connect abstract knowledge and authentic tasks in educational practice, including computer programming education. Despite its promising effects on improving learning in multiple aspects, PjBL remains a struggle due to its complexity. Completing an authentic programming project involves a…
VIV analysis of pipelines under complex span conditions
NASA Astrophysics Data System (ADS)
Wang, James; Steven Wang, F.; Duan, Gang; Jukes, Paul
2009-06-01
Spans occur when a pipeline is laid on a rough undulating seabed or when upheaval buckling occurs due to constrained thermal expansion. This not only results in static and dynamic loads on the flowline at span sections, but also generates vortex induced vibration (VIV), which can lead to fatigue issues. The phenomenon, if not predicted and controlled properly, will negatively affect pipeline integrity, leading to expensive remediation and intervention work. Span analysis can be complicated by: long span lengths, a large number of spans caused by a rough seabed, and multi-span interactions. In addition, the complexity can be more onerous and challenging when soil uncertainty, concrete degradation and unknown residual lay tension are considered in the analysis. This paper describes the latest developments and a ‘state-of-the-art’ finite element analysis program that has been developed to simulate the span response of a flowline under complex boundary and loading conditions. Both VIV and direct wave loading are captured in the analysis and the results are sequentially used for the ultimate limit state (ULS) check and fatigue life calculation.
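A back-of-envelope screening of the kind that motivates such detailed analysis can be sketched as follows; the pipe properties are hypothetical, the beam is idealized as pinned-pinned, and effects the paper accounts for (effective axial force, soil stiffness, multi-span interaction, concrete degradation) are ignored here.

import math

E = 207e9            # Pa, steel Young's modulus
D = 0.3239           # m, outer diameter (hypothetical 12-inch line)
t = 0.0127           # m, wall thickness (assumed)
L = 40.0             # m, free span length (assumed)
m_e = 150.0          # kg/m, effective mass incl. contents and added mass (assumed)
U = 0.8              # m/s, current speed (assumed)
St = 0.2             # Strouhal number (typical value)

I = math.pi / 64.0 * (D**4 - (D - 2 * t)**4)                # second moment of area
f_n = (math.pi / 2.0) * math.sqrt(E * I / (m_e * L**4))     # pinned-pinned first mode
f_s = St * U / D                                            # vortex shedding frequency

print(f"f_n = {f_n:.2f} Hz, f_s = {f_s:.2f} Hz, ratio = {f_s / f_n:.2f}")
# cross-flow lock-in is typically a concern when f_s approaches f_n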
Robustness Elasticity in Complex Networks
Matisziw, Timothy C.; Grubesic, Tony H.; Guo, Junyu
2012-01-01
Network robustness refers to a network’s resilience to stress or damage. Given that most networks are inherently dynamic, with changing topology, loads, and operational states, their robustness is also likely subject to change. However, in most analyses of network structure, it is assumed that interaction among nodes has no effect on robustness. To investigate the hypothesis that network robustness is not sensitive or elastic to the level of interaction (or flow) among network nodes, this paper explores the impacts of network disruption, namely arc deletion, over a temporal sequence of observed nodal interactions for a large Internet backbone system. In particular, a mathematical programming approach is used to identify exact bounds on robustness to arc deletion for each epoch of nodal interaction. Elasticity of the identified bounds relative to the magnitude of arc deletion is assessed. Results indicate that system robustness can be highly elastic to spatial and temporal variations in nodal interactions within complex systems. Further, the presence of this elasticity provides evidence that a failure to account for nodal interaction can confound characterizations of complex networked systems. PMID:22808060
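A brute-force miniature of the underlying question, using networkx instead of the paper's exact mathematical programming formulation and an invented five-arc graph, looks like this:

import networkx as nx

G = nx.DiGraph()
edges = [("s", "a", 10), ("s", "b", 5), ("a", "t", 7), ("b", "t", 9), ("a", "b", 4)]
for u, v, c in edges:
    G.add_edge(u, v, capacity=c)

baseline = nx.maximum_flow_value(G, "s", "t")
impacts = {}
for u, v, c in edges:
    H = G.copy()
    H.remove_edge(u, v)                      # single-arc deletion
    impacts[(u, v)] = baseline - nx.maximum_flow_value(H, "s", "t")

worst = max(impacts, key=impacts.get)
print(f"baseline flow {baseline}, worst single deletion {worst} loses {impacts[worst]}")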
The Effect of a Complex Training Program on Skating Abilities in Ice Hockey Players
Lee, Changyoung; Lee, Sookyung; Yoo, Jaehyun
2014-01-01
[Purpose] Little data exist on systematic training programs to improve skating abilities in ice hockey players. The purpose of this study was to evaluate the effectiveness of a complex training program on skating abilities in ice hockey players. [Methods] Ten male ice hockey players (training group) that engaged in 12 weeks of complex training and skating training and ten male players (control group) that only participated in 12 weeks of skating training completed on-ice skating tests, including a 5-time 18-meter shuttle, t-test, Rink dash 5 times, and line drill, before, during, and after the training. [Results] Significant group-by-time interactions were found in all skating ability tests. [Conclusion] The complex training program intervention for 12 weeks improved the skating abilities of the ice hockey players. PMID:24764628
Screening procedure for airborne pollutants emitted from a high-tech industrial complex in Taiwan.
Wang, John H C; Tsai, Ching-Tsan; Chiang, Chow-Feng
2015-11-01
Despite the modernization of computational techniques, atmospheric dispersion modeling remains a complicated task as it involves the use of large amounts of interrelated data with wide variability. The continuously growing list of regulated air pollutants also increases the difficulty of this task. To address these challenges, this study aimed to develop a screening procedure for a long-term exposure scenario by generating a site-specific lookup table of hourly averaged dispersion factors (χ/Q), which could be evaluated by downwind distance, direction, and effective plume height only. To allow for such simplification, the average plume rise was weighted with the frequency distribution of meteorological data so that the prediction of χ/Q could be decoupled from the meteorological data. To illustrate this procedure, 20 receptors around a high-tech complex in Taiwan were selected. Five consecutive years of hourly meteorological data were acquired to generate a lookup table of χ/Q, as well as two regression formulas of plume rise as functions of downwind distance, buoyancy flux, and stack height. To calculate the concentrations for the selected receptors, a six-step Excel algorithm was programmed with four years of emission records and 10 most critical toxics were screened out. A validation check using Industrial Source Complex (ISC3) model with the same meteorological and emission data showed an acceptable overestimate of 6.7% in the average concentration of 10 nearby receptors. The procedure proposed in this study allows practical and focused emission management for a large industrial complex and can therefore be integrated into an air quality decision-making system. Copyright © 2015 Elsevier Ltd. All rights reserved.
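The screening arithmetic itself reduces to multiplying an emission rate by the tabulated dispersion factor for a receptor's distance and direction; the sketch below uses invented chi/Q values and emission rates purely for illustration.

# chi/Q lookup keyed by (downwind distance in m, wind sector), units s/m^3 (hypothetical)
chi_over_q = {
    (500,  "NE"): 2.1e-5,
    (1000, "NE"): 8.4e-6,
    (2000, "NE"): 3.0e-6,
}

emissions_g_per_s = {"toluene": 1.2, "xylene": 0.4, "IPA": 2.5}   # hypothetical

def screen(receptor, emissions, table):
    factor = table[receptor]
    # concentration in g/m^3, converted to ug/m^3
    return {p: q * factor * 1e6 for p, q in emissions.items()}

conc = screen((1000, "NE"), emissions_g_per_s, chi_over_q)
for pollutant, c in sorted(conc.items(), key=lambda kv: -kv[1]):
    print(f"{pollutant}: {c:.1f} ug/m^3")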
Molecular Line Lists for Scandium and Titanium Hydride Using the DUO Program
NASA Astrophysics Data System (ADS)
Lodi, Lorenzo; Yurchenko, Sergei N.; Tennyson, Jonathan
2015-06-01
Transition-metal-containing (TMC) molecules often have very complex electronic spectra because of their large number of low-lying, interacting electronic states, of the large multi-reference character of the electronic states and of the large magnitude of spin-orbit and relativistic effects. As a result, fully ab initio calculations of line positions and intensities of TMC molecules have an accuracy which is considerably worse than the one usually achievable for molecules made up by main-group atoms only. In this presentation we report on new theoretical line lists for scandium hydride ScH and titanium hydride TiH. Scandium and titanium are the lightest transition metal atoms and by virtue of their small number of valence electrons are amenable to high-level electronic-structure treatments and serve as ideal benchmark systems. We report for both systems energy curves, dipole curves and various coupling curves (including spin-orbit) characterising their electronic spectra up to about 20 000 cm-1. Curves were obtained using Internally-Contracted Multi Reference Configuration Interaction (IC-MRCI) as implemented in the quantum chemistry package MOLPRO. The curves were used for the solution of the coupled-surface ro-vibronic problem using the in-house program DUO. DUO is a newly-developed, general program for the spectroscopy of diatomic molecules and its main functionality will be described. The resulting line lists for ScH and TiH are made available as part of the Exomol project. L. Lodi, S. N. Yurchenko and J. Tennyson, Mol. Phys. (Handy special issue) in press. S. N. Yurchenko, L. Lodi, J. Tennyson and A. V. Stolyarov, Computer Phys. Comms., to be submitted.
Complex Burn Region Module (CBRM) update
NASA Technical Reports Server (NTRS)
Adams, Carl L.; Jenkins, Billy
1991-01-01
Presented here is a Complex Burn Region Module (CBRM) update for the Solid Rocket Internal Ballistics Module (SRIBM) Program for the Advanced Solid Rocket Motor (ASRM) design/performance assessments. The goal was to develop an improved version of the solid rocket internal ballistics module program that contains a diversified complex region model for motor grain design, performance prediction, and evaluation.
Singh, Prafull Kumar; Roukounakis, Aristomenis; Frank, Daniel O.; Kirschnek, Susanne; Das, Kushal Kumar; Neumann, Simon; Madl, Josef; Römer, Winfried; Zorzin, Carina; Borner, Christoph; Haimovici, Aladin; Garcia-Saez, Ana; Weber, Arnim; Häcker, Georg
2017-01-01
The Bcl-2 family protein Bim triggers mitochondrial apoptosis. Bim is expressed in nonapoptotic cells at the mitochondrial outer membrane, where it is activated by largely unknown mechanisms. We found that Bim is regulated by formation of large protein complexes containing dynein light chain 1 (DLC1). Bim rapidly inserted into cardiolipin-containing membranes in vitro and recruited DLC1 to the membrane. Bim binding to DLC1 induced the formation of large Bim complexes on lipid vesicles, on isolated mitochondria, and in intact cells. Native gel electrophoresis and gel filtration showed Bim-containing mitochondrial complexes of several hundred kilodaltons in all cells tested. Bim unable to form complexes was consistently more active than complexed Bim, which correlated with its substantially reduced binding to anti-apoptotic Bcl-2 proteins. At endogenous levels, Bim surprisingly bound only anti-apoptotic Mcl-1 but not Bcl-2 or Bcl-XL, recruiting only Mcl-1 into large complexes. Targeting of DLC1 by RNAi in human cell lines induced disassembly of Bim–Mcl-1 complexes and the proteasomal degradation of Mcl-1 and sensitized the cells to the Bcl-2/Bcl-XL inhibitor ABT-737. Regulation of apoptosis at mitochondria thus extends beyond the interaction of monomers of proapoptotic and anti-apoptotic Bcl-2 family members but involves more complex structures of proteins at the mitochondrial outer membrane, and targeting complexes may be a novel therapeutic strategy. PMID:28982759
Different Evolutionary Paths to Complexity for Small and Large Populations of Digital Organisms
2016-01-01
A major aim of evolutionary biology is to explain the respective roles of adaptive versus non-adaptive changes in the evolution of complexity. While selection is certainly responsible for the spread and maintenance of complex phenotypes, this does not automatically imply that strong selection enhances the chance for the emergence of novel traits, that is, the origination of complexity. Population size is one parameter that alters the relative importance of adaptive and non-adaptive processes: as population size decreases, selection weakens and genetic drift grows in importance. Because of this relationship, many theories invoke a role for population size in the evolution of complexity. Such theories are difficult to test empirically because of the time required for the evolution of complexity in biological populations. Here, we used digital experimental evolution to test whether large or small asexual populations tend to evolve greater complexity. We find that both small and large—but not intermediate-sized—populations are favored to evolve larger genomes, which provides the opportunity for subsequent increases in phenotypic complexity. However, small and large populations followed different evolutionary paths towards these novel traits. Small populations evolved larger genomes by fixing slightly deleterious insertions, while large populations fixed rare beneficial insertions that increased genome size. These results demonstrate that genetic drift can lead to the evolution of complexity in small populations and that purifying selection is not powerful enough to prevent the evolution of complexity in large populations. PMID:27923053
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the existence of numerous unverified and uncertified legacy software in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
Sunflower Hybrid Breeding: From Markers to Genomic Selection
Dimitrijevic, Aleksandra; Horn, Renate
2018-01-01
In sunflower, molecular markers for simple traits as, e.g., fertility restoration, high oleic acid content, herbicide tolerance or resistances to Plasmopara halstedii, Puccinia helianthi, or Orobanche cumana have been successfully used in marker-assisted breeding programs for years. However, agronomically important complex quantitative traits like yield, heterosis, drought tolerance, oil content or selection for disease resistance, e.g., against Sclerotinia sclerotiorum have been challenging and will require genome-wide approaches. Plant genetic resources for sunflower are being collected and conserved worldwide that represent valuable resources to study complex traits. Sunflower association panels provide the basis for genome-wide association studies, overcoming disadvantages of biparental populations. Advances in technologies and the availability of the sunflower genome sequence made novel approaches on the whole genome level possible. Genotype-by-sequencing, and whole genome sequencing based on next generation sequencing technologies facilitated the production of large amounts of SNP markers for high density maps as well as SNP arrays and allowed genome-wide association studies and genomic selection in sunflower. Genome wide or candidate gene based association studies have been performed for traits like branching, flowering time, resistance to Sclerotinia head and stalk rot. First steps in genomic selection with regard to hybrid performance and hybrid oil content have shown that genomic selection can successfully address complex quantitative traits in sunflower and will help to speed up sunflower breeding programs in the future. To make sunflower more competitive toward other oil crops higher levels of resistance against pathogens and better yield performance are required. In addition, optimizing plant architecture toward a more complex growth type for higher plant densities has the potential to considerably increase yields per hectare. Integrative approaches combining omic technologies (genomics, transcriptomics, proteomics, metabolomics and phenomics) using bioinformatic tools will facilitate the identification of target genes and markers for complex traits and will give a better insight into the mechanisms behind the traits. PMID:29387071
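For readers unfamiliar with genomic selection, the toy sketch below fits a ridge-regression (RR-BLUP-like) model to simulated SNP genotypes and predicts breeding values for unseen lines; the data are simulated and the model is a generic stand-in, not the methods of the works cited.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_lines, n_snps = 300, 2000
X = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)   # 0/1/2 allele counts
true_effects = rng.normal(0, 0.05, n_snps)                     # simulated SNP effects
y = X @ true_effects + rng.normal(0, 1.0, n_lines)             # a trait such as oil content

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = Ridge(alpha=100.0).fit(X_tr, y_tr)
gebv = model.predict(X_te)                  # genomic estimated breeding values
print("predictive correlation:", round(np.corrcoef(gebv, y_te)[0, 1], 2))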
Intrinsically Disordered Proteins and the Origins of Multicellular Organisms
NASA Astrophysics Data System (ADS)
Dunker, A. Keith
In simple multicellular organisms all of the cells are in direct contact with the surrounding milieu, whereas in complex multicellular organisms some cells are completely surrounded by other cells. Current phylogenetic trees indicate that complex multicellular organisms evolved independently from unicellular ancestors about 10 times, and only among the eukaryotes, including once for animals, twice each for green, red, and brown algae, and thrice for fungi. Given these multiple independent evolutionary lineages, we asked two questions: 1. Which molecular functions underpinned the evolution of multicellular organisms?; and, 2. Which of these molecular functions depend on intrinsically disordered proteins (IDPs)? Compared to unicellularity, multicellularity requires the advent of molecules for cellular adhesion, for cell-cell communication and for developmental programs. In addition, the developmental programs need to be regulated over space and time. Finally, each multicellular organism has cell-specific biochemistry and physiology. Thus, the evolution of complex multicellular organisms from unicellular ancestors required five new classes of functions. To answer the second question we used keywords in Swiss-Prot ranked for associations with predictions of protein structure or disorder. With a Z-score of 18.8 compared to random-function proteins, differentiation was the biological process most strongly associated with IDPs. As expected from this result, large numbers of individual proteins associated with differentiation exhibit substantial regions of predicted disorder. For the animals for which there is the most readily available data all five of the underpinning molecular functions for multicellularity were found to depend critically on IDP-based mechanisms and other evidence supports these ideas. While the data are more sparse, IDPs seem to similarly underlie the five new classes of functions for plants and fungi as well, suggesting that IDPs were indeed crucial for the evolution of complex multicellular organisms. These new findings necessitate a rethinking of the gene regulatory network models currently used to explain cellular differentiation and the evolution of complex multicellular organisms.
Approximate kernel competitive learning.
Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang
2015-03-01
Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix that is too large to be calculated and kept in the memory and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale dataset. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL), which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling would work for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL can perform comparably as KCL, with a large reduction on computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
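A rough analogue of the sampling idea (not the authors' AKCL/PAKCL algorithms) is to map the data into a low-rank kernel feature space via Nystroem sampling and run plain winner-take-all competitive learning there, so the full kernel matrix is never formed; everything below, from the RBF kernel width to the learning rate, is an illustrative assumption.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.kernel_approximation import Nystroem

X, _ = make_blobs(n_samples=2000, centers=3, random_state=0)
phi = Nystroem(kernel="rbf", gamma=0.5, n_components=100,
               random_state=0).fit_transform(X)       # sampled kernel feature map

rng = np.random.default_rng(0)
k, lr = 3, 0.05
W = phi[rng.choice(len(phi), size=k, replace=False)].copy()   # prototypes

for epoch in range(10):
    for x in phi[rng.permutation(len(phi))]:
        j = np.argmin(((W - x) ** 2).sum(axis=1))     # winner-take-all
        W[j] += lr * (x - W[j])                       # move winner toward the sample

labels = np.argmin(((phi[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
print(np.bincount(labels))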
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.
Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories, including (i) CAD to PART software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress & distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feed stocks, and large metallic components to enhance America’s economic competitiveness.
Precedent approach to the formation of programs for cyclic objects control
NASA Astrophysics Data System (ADS)
Kulakov, S. M.; Trofimov, V. B.; Dobrynin, A. S.; Taraborina, E. N.
2018-05-01
The idea and procedure for formalizing the precedent method of forming complex control solutions (complex control programs) are discussed with respect to technological or organizational objects whose operation is organized cyclically. A typical functional structure of a precedent-based control system for a complex technological unit is developed, including a subsystem for retrospective optimization of actually implemented control programs. As an example, the problem of constructing replaceable planograms for the operation of the link of a heading-and-winning machine on the basis of precedents is considered.
Developing Leadership for Increasing Complexity: A Review of Online Graduate Leadership Programs
ERIC Educational Resources Information Center
Winton, Steven L.; Palmer, Sarah; Hughes, Patrick J.
2018-01-01
Leadership education must evolve to keep pace with the growing recognition that effective leadership happens in a complex environment and is as much a systemic variable as a personal one. As part of a program review process, a graduate leadership program at a private Midwestern university conducted a qualitative review of 18 online graduate…
Has First-Grade Core Reading Program Text Complexity Changed across Six Decades?
ERIC Educational Resources Information Center
Fitzgerald, Jill; Elmore, Jeff; Relyea, Jackie Eunjung; Hiebert, Elfrieda H.; Stenner, A. Jackson
2016-01-01
The purpose of the study was to address possible text complexity shifts across the past six decades for a continually best-selling first-grade core reading program. The anthologies of one publisher's seven first-grade core reading programs were examined using computer-based analytics, dating from 1962 to 2013. Variables were Overall Text…
Complex System Governance for Acquisition
2016-04-30
2014, September–October). Cybersecurity challenges for program managers. Defense AT&L. Naphade, M., Banavar, G., Harrison, C., Paraszczak, J...the Acquisition Research Program of the Graduate School of Business & Public Policy at the Naval Postgraduate School. To request defense...Dickmann, Vice President, Sonalysts Inc. A Complex Systems Perspective of Risk Mitigation and Modeling in Development and Acquisition Programs Roshanak
ERIC Educational Resources Information Center
Fuwa, Minori; Kayama, Mizue; Kunimune, Hisayoshi; Hashimoto, Masami; Asano, David K.
2015-01-01
We have explored educational methods for algorithmic thinking for novices and implemented a block programming editor and a simple learning management system. In this paper, we propose a program/algorithm complexity metric specified for novice learners. This metric is based on the variable usage in arithmetic and relational formulas in learner's…
Life as a graduate student in a globalized collaboration
NASA Astrophysics Data System (ADS)
Fracchiolla, Claudia
2009-05-01
A global vision is important, if not essential, in all scientific fields. In the case of graduate students, the language of instruction is not the only issue. We must learn different research methodologies and understand a new set of complex cultural dynamics both in our living situations and in our new university workplaces. My research program is in experimental particle astrophysics. I study ultra-high energy cosmic rays with the Pierre Auger Observatory located in Argentina. More than 400 scientists from 18 different countries are a part of this science program. Being a graduate student within this model provides me with a comprehensive understanding of global cultures combined with research skills, proficiency in different languages, and an international experience. I will discuss the benefits and challenges of working in a large international collaboration, and how it can help you grow not only as a scientist, but also as a person.
Optimization of municipal pressure pumping station layout and sewage pipe network design
NASA Astrophysics Data System (ADS)
Tian, Jiandong; Cheng, Jilin; Gong, Yi
2018-03-01
Accelerated urbanization places extraordinary demands on sewer networks; thus optimization research to improve the design of these systems has practical significance. In this article, a subsystem nonlinear programming model is developed to optimize pumping station layout and sewage pipe network design. The subsystem model is expanded into a large-scale complex nonlinear programming system model to find the minimum total annual cost of the pumping station and network of all pipe segments. A comparative analysis is conducted using the sewage network in Taizhou City, China, as an example. The proposed method demonstrated that significant cost savings could have been realized if the studied system had been optimized using the techniques described in this article. Therefore, the method has practical value for optimizing urban sewage projects and provides a reference for theoretical research on optimization of urban drainage pumping station layouts.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs has led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis and reviews techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
The role of citizens in detecting and responding to a rapid marine invasion
Scyphers, Stephen B.; Powers, Sean P.; Akins, J. Lad; Drymon, J. Marcus; Martin, Charles M.; Schobernd, Zeb H.; Schofield, Pamela J.; Shipp, Robert L.; Switzer, Theodore S.
2015-01-01
Documenting and responding to species invasions requires innovative strategies that account for ecological and societal complexities. We used the recent expansion of Indo-Pacific lionfish (Pterois volitans/miles) throughout northern Gulf of Mexico coastal waters to evaluate the role of stakeholders in documenting and responding to a rapid marine invasion. We coupled an online survey of spearfishers and citizen science monitoring programs with traditional fishery-independent data sources and found that citizen observations documented lionfish 1–2 years earlier and more frequently than traditional reef fish monitoring programs. Citizen observations first documented lionfish in 2010 followed by rapid expansion and proliferation in 2011 (+367%). From the survey of spearfishers, we determined that diving experience and personal observations of lionfish strongly influenced perceived impacts, and these perceptions were powerful predictors of support for initiatives. Our study demonstrates the value of engaging citizens for assessing and responding to large-scale and time-sensitive conservation problems.
Systems survivor: a program for house staff in systems-based practice.
Turley, Christine B; Roach, Richard; Marx, Marilyn
2007-01-01
The Systems-Based Practice competency expanded the scope of graduate medical education. Innovative approaches are needed to teach this material. We have designed and implemented a rotation in Systems-Based Practice focused on the interrelationships of patient care, clinical revenue, and the physician's role within health care systems. Experiential learning occurs during a 5-day rotation through 26 areas encompassing the clinical revenue cycle, guided by "expert" staff. Using a reversal of the TV show Survivor, house staff begin conceptually "alone" and discover they are members of a large, dedicated team. Assessment results, including a system knowledge test and course evaluations, are presented. Twenty-five residents from four clinical departments participated in Year 1. An increase in pretest to posttest knowledge scores of 14.8% (p
Mental health care use by soldiers conducting counterinsurgency operations.
Applewhite, Larry; Keller, Nathan; Borah, Adam
2012-05-01
Counterinsurgency (COIN) has become the cornerstone of the military's strategy to combat terrorist threats. COIN operations are complex and often expose soldiers to unfamiliar stressors as they fight the enemy while developing and maintaining rapport with the local populace. Utilizing a retrospective record review protocol, we examined 282 mental health files of soldiers assigned to a brigade combat team that operated from a large forward operating base in Iraq during the counterinsurgency campaign. Most reported sleep disturbance, depression, anxiety, irritability, and conflict with supervisors related to either operational stress, exposure to direct combat, or home front concerns. Most received brief individual supportive therapy or attended solution-focused group counseling emphasizing life skills training, post-traumatic stress treatment, women's support, or relationship skills. Psychopharmacologic treatment was an essential adjunct to the counseling program. Results indicate that supporting a COIN deployment requires a comprehensive mental health program that can respond to a wide range of mental health problems.
NASA Technical Reports Server (NTRS)
Roth, J. P.
1972-01-01
The following problems are considered: (1) methods for development of logic design together with algorithms, so that it is possible to compute a test for any failure in the logic design, if such a test exists, and developing algorithms and heuristics for the purpose of minimizing the computation for tests; and (2) a method of design of logic for ultra LSI (large scale integration). It was discovered that the so-called quantum calculus can be extended to render it possible: (1) to describe the functional behavior of a mechanism component by component, and (2) to compute tests for failures, in the mechanism, using the diagnosis algorithm. The development of an algorithm for the multioutput two-level minimization problem is presented and the program MIN 360 was written for this algorithm. The program has options of mode (exact minimum or various approximations), cost function, cost bound, etc., providing flexibility.
Management of optics. [for HEAO-2 X ray telescope
NASA Technical Reports Server (NTRS)
Kirchner, T. E.; Russell, M.
1981-01-01
American Science and Engineering, Inc. (AS&E) designed the large X-ray optic for the HEAO-2 X-ray Telescope. The key element in this project was the High Resolution Mirror Assembly (HRMA), with fabrication of the optical surfaces and their assembly and alignment subcontracted. The roles and organization of the key participants in the creation of HRMA are defined, and the degree of interaction between the groups is described. Management of this effort was extremely complex because of the intricate weaving of responsibilities, and AS&E, as HEAO-2 Program managers, needed to be well versed in the scientific objectives, the technical requirements, the program requirements, and the subcontract management. Understanding these factors was essential for implementing both technical and management controls, such as schedule and budget constraints, in-process control, residence requirements, and scientist review and feedback. Despite unforeseen technical problems and interaction differences, the HEAO-2 was built on schedule and to specification.
Near-field measurement facility plans at Lewis Research Center
NASA Technical Reports Server (NTRS)
Sharp, R. G.
1983-01-01
The direction of future antenna technology will be toward antennas which are large, both physically and electrically, which will operate at frequencies up to 60 GHz, and which are non-reciprocal and complex, implementing multiple-beam and scanning beam concepts and monolithic semiconductor devices and techniques. The acquisition of accurate antenna performance measurements is a critical part of the advanced antenna research program and represents a substantial antenna measurement technology challenge, considering the special characteristics of future spacecraft communications antennas. Comparison of various antenna testing techniques and their relative advantages and disadvantages shows that the near-field approach is necessary to meet immediate and long-term testing requirements. The LeRC facilities, the 22 ft x 22 ft horizontal antenna boresight planar scanner and the 60 ft x 60 ft vertical antenna boresight planar scanner (with a 60 GHz frequency and D/lambda = 3000 electrical size capabilities), will meet future program testing requirements.
DE LUCIA, A.; PASTORE, V.; BRACCI LAUDIERO, L.; BUONISSIMO, I.; RICCI, G.
2016-01-01
SUMMARY Programmes for the early identification of childhood hearing impairment allow the appropriate hearing aid fitting and rehabilitation process to start quickly; nevertheless, a large number of patients do not join the treatment program. The goal of this article is to present the results of a strategic review of the strengths, weaknesses, opportunities and threats connected with the audiologic/prosthetic/language follow-up process of children with bilateral permanent hearing impairment. Because small children are involved, the follow-up requires specialised professionals in a multidisciplinary team and a complex, prolonged, multi-faceted management. Within the framework of the Italian Ministry of Health project CCM 2013 "Preventing Communication Disorders: a Regional Program for Early Identification, Intervention and Care of Hearing Impaired Children", the purpose of this analysis was to propose recommendations that can harmonise criteria for outcome evaluation and provide guidance on the most appropriate assessment methods to be used in the follow-up course of children with permanent hearing impairment. PMID:27054392
Nuclear envelope and genome interactions in cell fate
Talamas, Jessica A.; Capelson, Maya
2015-01-01
The eukaryotic cell nucleus houses an organism’s genome and is the location within the cell where all signaling induced and development-driven gene expression programs are ultimately specified. The genome is enclosed and separated from the cytoplasm by the nuclear envelope (NE), a double-lipid membrane bilayer, which contains a large variety of trans-membrane and associated protein complexes. In recent years, research regarding multiple aspects of the cell nucleus points to a highly dynamic and coordinated concert of efforts between chromatin and the NE in regulation of gene expression. Details of how this concert is orchestrated and how it directs cell differentiation and disease are coming to light at a rapid pace. Here we review existing and emerging concepts of how interactions between the genome and the NE may contribute to tissue specific gene expression programs to determine cell fate. PMID:25852741
Ground Operations Aerospace Language (GOAL). Volume 3: Data bank
NASA Technical Reports Server (NTRS)
1973-01-01
The GOAL (Ground Operations Aerospace Language) test programming language was developed for use in ground checkout operations in a space vehicle launch environment. To insure compatibility with a maximum number of applications, a systematic and error-free method of referencing command/response (analog and digital) hardware measurements is a principle feature of the language. Central to the concept of requiring the test language to be independent of launch complex equipment and terminology is that of addressing measurements via symbolic names that have meaning directly in the hardware units being tested. To form the link from test program through test system interfaces to the units being tested the concept of a data bank has been introduced. The data bank is actually a large cross-reference table that provides pertinent hardware data such as interface unit addresses, data bus routings, or any other system values required to locate and access measurements.
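The data bank is, in effect, a cross-reference table from symbolic measurement names to the addressing information needed to reach the hardware. A minimal Python sketch of that lookup idea follows; the entry names and fields are hypothetical, not taken from the GOAL specification.

```python
# Hypothetical data-bank entries: symbolic measurement name -> hardware routing data.
DATA_BANK = {
    "FUEL_TANK_PRESSURE": {"interface_unit": 0x2A, "data_bus": 3, "channel": 17},
    "ENGINE_VALVE_CMD":   {"interface_unit": 0x2A, "data_bus": 1, "channel": 4},
}

def resolve(measurement_name: str) -> dict:
    """Translate a test program's symbolic name into the interface-unit address,
    data bus routing, and channel needed to access the measurement."""
    try:
        return DATA_BANK[measurement_name]
    except KeyError:
        raise KeyError(f"{measurement_name!r} is not defined in the data bank") from None

print(resolve("FUEL_TANK_PRESSURE"))
```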
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
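The paper itself contains no code; as a loose illustration of the coarse-grained parallelism a Beowulf-style port exploits, this hedged Python sketch farms independent, processor-intensive "cells" out to worker processes with the standard multiprocessing module (the per-cell workload is a stand-in, not the USGS urban-dynamics code).

```python
from multiprocessing import Pool

def process_cell(cell_id: int) -> float:
    """Stand-in for a processor-intensive calculation on one independent work unit."""
    total = 0.0
    for i in range(1, 50_000):
        total += (cell_id % 7 + 1) / i
    return total

if __name__ == "__main__":
    cells = range(10_000)            # independent work units, e.g. map tiles
    with Pool(processes=8) as pool:  # workers stand in for cluster nodes
        results = pool.map(process_cell, cells, chunksize=100)
    print(f"processed {len(results)} cells")
```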
Data systems and computer science: Software Engineering Program
NASA Technical Reports Server (NTRS)
Zygielbaum, Arthur I.
1991-01-01
An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blattner, J.W.; Bramble, G.M.
1994-06-01
Armed with more than 120 investigative agents, the US Environmental Protection Agency, through its attorneys at the Dept. of Justice, charges 5 to 10 engineers and business people with criminal violations of the nation's environmental regulations in any given week. There are some 10,000 pages of federal (let alone state) environmental regulations. The rules apply to large and small companies alike. As a practical matter, the sheer scope and complexity of environmental regulatory programs make 100% compliance virtually unattainable for most industrial enterprises. Where it is no longer a defense to claim lack of knowledge of one's regulatory obligations, and where courts allow the inference of criminal knowledge based on what the defendant should have known, what is a company to do? The environmental audit provides a solution to this problem. Progressive audit programs are established with three goals in mind: to ensure that programs and practices at facilities are in compliance with applicable rules and regulations; to affirm that management systems are in place at the facilities to support ongoing compliance; and to identify needs or opportunities where it may be desirable to go beyond compliance to protect human health and the environment. This paper discusses the implementation of an audit program.
Coordinated Care Management For Dementia In A Large, Academic Health System
Tan, Zaldy S.; Jennings, Lee; Reuben, David
2014-01-01
Alzheimer’s disease and other dementias are chronic, incurable diseases that require coordinated care that addresses the medical, behavioral, and social aspects of the disease. With funding from the Center for Medicare and Medicaid Innovation (the Innovation Center), we launched a dementia care program in which a nurse practitioner acting as a dementia care manager worked with primary care physicians to develop and implement a dementia care plan that offers training and support to caregivers, manages care transitions, and facilitates access to community-based services. Post-visit surveys showed high levels of caregiver satisfaction. As program enrollment grows, outcomes will be tracked based on the triple aim developed by the Institute for Healthcare Improvement and adopted by the Centers for Medicare and Medicaid Services: better care, better health, and lower cost and utilization. The program, if successful at achieving the triple aim, may serve as a national model for how dementia and other chronic diseases can be managed in partnership with primary care practices. The program may also inform policy and reimbursement decisions for the recently released transitional care management codes and the complex chronic care management codes to be released by Medicare in 2015. PMID:24711323
Bergin, Michael
2011-01-01
Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the research analyst during the process, leading to inconsistencies occurring. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node rather than leaving it to a stage whereby data analysis is well advanced.
Reflector surface distortion analysis techniques (thermal distortion analysis of antennas in space)
NASA Technical Reports Server (NTRS)
Sharp, R.; Liao, M.; Giriunas, J.; Heighway, J.; Lagin, A.; Steinbach, R.
1989-01-01
A group of large computer programs is used to predict the farfield antenna pattern of reflector antennas in the thermal environment of space. Thermal Radiation Analysis Systems (TRASYS) is a thermal radiation analyzer that interfaces with Systems Improved Numerical Differencing Analyzer (SINDA), a finite difference thermal analysis program. The programs linked together for this analysis can now be used to predict antenna performance in the constantly changing space environment. They can be used for very complex spacecraft and antenna geometries. Performance degradation caused by methods of antenna reflector construction and materials selection is also taken into consideration. However, the principal advantage of using this program linkage is to account for distortions caused by the thermal environment of space and the hygroscopic effects of the dry-out of graphite/epoxy materials after the antenna is placed into orbit. The results of this type of analysis could ultimately be used to predict antenna reflector shape versus orbital position. A phased array antenna distortion compensation system could then use this data to make RF phase front corrections. That is, the phase front could be adjusted to account for the distortions in the antenna feed and reflector geometry for a particular orbital position.
Economic value evaluation in disease management programs.
Magnezi, Racheli; Reicher, Sima; Shani, Mordechai
2008-05-01
Chronic disease management has been a rapidly growing entity in the 21st century as a strategy for managing chronic illnesses in large populations. However, experience has shown that disease management programs have not been able to demonstrate their financial value. The objectives of disease management programs are to create quality benchmarks, such as principles and guidelines, and to establish a uniform set of metrics and a standardized methodology for evaluating them. In order to illuminate the essence of disease management and its components, as well as the complexity and the problematic nature of performing economic calculations of their profitability and value, we collected data from several reports that dealt with the economic intervention of disease management programs. The disease management economic evaluation is composed of a series of steps, including the following major categories: data/information technology, information generation, assessment/recommendations, actionable customer plans, and program assessment/reassessment. We demonstrate the elements necessary for economic analysis. Disease management is one of the most innovative tools in the managed care environment and is still in the process of being defined. Therefore, objectives should include the creation of quality measures, such as principles and guidelines, and the establishment of a uniform set of metrics and a standardized methodology for evaluating them.
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
Quantum communication complexity advantage implies violation of a Bell inequality
Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600
NASA Astrophysics Data System (ADS)
Bezruczko, N.; Stanley, T.; Battle, M.; Latty, C.
2016-11-01
Despite broad sweeping pronouncements by international research organizations that social sciences are being integrated into global research programs, little attention has been directed toward obstacles blocking productive collaborations. In particular, social sciences routinely implement nonlinear, ordinal measures, which fundamentally inhibit integration with overarching scientific paradigms. The widely promoted general linear model in contemporary social science methods is largely based on untransformed scores and ratings, which are neither objective nor linear. This issue has historically separated physical and social sciences, which this report now asserts is unnecessary. In this research, nonlinear, subjective caregiver ratings of confidence to care for children supported by complex medical technologies were transformed to an objective scale defined by logits (N=70). Transparent linear units from this transformation provided foundational insights into measurement properties of a social-humanistic caregiving construct, which clarified physical and social caregiver implications. Parameterized items and ratings were also subjected to multivariate hierarchical analysis, then decomposed to demonstrate theoretical coherence (R² > .50), which provided further support for convergence of mathematical parameterization, physical expectations, and a social-humanistic construct. These results present substantial support for improving integration of social sciences with contemporary scientific research programs by emphasizing construction of common variables with objective, linear units.
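The transformation itself is not reproduced in the abstract; in Rasch-type measurement of the kind implied here, an ordinal rating is mapped to a linear scale by modeling the log-odds of endorsement as a difference of person and item parameters expressed in logits. A hedged sketch of the standard dichotomous form, with notation assumed rather than quoted from the paper:

```latex
% Dichotomous Rasch model: person n has confidence \beta_n, item i has difficulty \delta_i
% (both in logits). The log-odds of a positive rating is linear in these parameters:
\ln\!\left(\frac{P_{ni}}{1 - P_{ni}}\right) = \beta_n - \delta_i,
\qquad
P_{ni} = \frac{e^{\beta_n - \delta_i}}{1 + e^{\beta_n - \delta_i}} .
```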
Numerical Modeling of Propellant Boil-Off in a Cryogenic Storage Tank
NASA Technical Reports Server (NTRS)
Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.; Sass, J. P.; Fesmire, J. E.
2007-01-01
A numerical model to predict boil-off of stored propellant in large spherical cryogenic tanks has been developed. Accurate prediction of tank boil-off rates for different thermal insulation systems was the goal of this collaboration effort. The Generalized Fluid System Simulation Program, integrating flow analysis and conjugate heat transfer for solving complex fluid system problems, was used to create the model. Calculation of tank boil-off rate requires simultaneous simulation of heat transfer processes among liquid propellant, vapor ullage space, and tank structure. The reference tank for the boil-off model was the 850,000 gallon liquid hydrogen tank at Launch Complex 39B (LC- 39B) at Kennedy Space Center, which is under study for future infrastructure improvements to support the Constellation program. The methodology employed in the numerical model was validated using a sub-scale model and tank. Experimental test data from a 1/15th scale version of the LC-39B tank using both liquid hydrogen and liquid nitrogen were used to anchor the analytical predictions of the sub-scale model. Favorable correlations between sub-scale model and experimental test data have provided confidence in full-scale tank boil-off predictions. These methods are now being used in the preliminary design for other cases including future launch vehicles
Heavy ligand atom induced large magnetic anisotropy in Mn(ii) complexes.
Chowdhury, Sabyasachi Roy; Mishra, Sabyashachi
2017-06-28
In the search for single molecule magnets, metal ions are considered pivotal towards achieving large magnetic anisotropy barriers. In this context, the influence of ligands with heavy elements, showing large spin-orbit coupling, on magnetic anisotropy barriers was investigated using a series of Mn(ii)-based complexes, in which the metal ion did not have any orbital contribution. The mixing of metal and ligand orbitals was achieved by explicitly correlating the metal and ligand valence electrons with CASSCF calculations. The CASSCF wave functions were further used for evaluating spin-orbit coupling and zero-field splitting parameters for these complexes. For Mn(ii) complexes with heavy ligand atoms, such as Br and I, several interesting inter-state mixings occur via the spin-orbit operator, which results in large magnetic anisotropy in these Mn(ii) complexes.
A high performance cost-effective digital complex correlator for an X-band polarimetry survey.
Bergano, Miguel; Rocha, Armando; Cupido, Luís; Barbosa, Domingos; Villela, Thyrso; Boas, José Vilas; Rocha, Graça; Smoot, George F
2016-01-01
Detailed knowledge of the Milky Way radio emission is important for characterizing galactic foregrounds masking extragalactic and cosmological signals. Updating the global sky models describing radio emissions over a very large spectral band requires high sensitivity experiments capable of observing large sky areas with long integration times. Here, we present the design of a new 10 GHz (X-band) polarimeter digital back-end to map the polarization components of the galactic synchrotron radiation field of the Northern Hemisphere sky. The design follows the digital processing trends in radio astronomy and implements a large bandwidth (1 GHz) digital complex cross-correlator to extract the Stokes parameters of the incoming synchrotron radiation field. We describe the hardware constraints, the implemented VLSI hardware description language code, and preliminary results. The implementation is based on the simultaneous digitized acquisition of the Cartesian components of the two linear receiver polarization channels. The design strategy involves double data rate acquisition of the ADC interleaved parallel bus, and field programmable gate array device programming at the register transfer level. The digital core of the back-end is capable of processing 32 Gbps and is built around an Altera field programmable gate array clocked at 250 MHz, 1 GSps analog to digital converters and a clock generator. The control of the field programmable gate array internal signal delays and a convenient use of its phase locked loops provide the timing required to achieve the target bandwidths and sensitivity. This solution is convenient for radio astronomy experiments requiring large bandwidth, high functionality, high volume availability and low cost. Of particular interest, this correlator was developed for the Galactic Emission Mapping project and is suitable for large sky area polarization continuum surveys. The solutions may also be adapted for use at the signal processing subsystem level in large projects like the Square Kilometre Array testbeds.
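As a software analogue of what the digital complex correlator does in the FPGA, the hedged numpy sketch below time-averages the auto- and cross-correlations of the two linear polarization channels (taken as complex baseband samples) and converts them to Stokes parameters; the sign and normalization conventions are assumptions, not the GEM design.

```python
import numpy as np

def stokes_from_channels(ex: np.ndarray, ey: np.ndarray):
    """ex, ey: complex baseband samples of the two linear polarizations.
    Returns time-averaged Stokes I, Q, U, V (linear-basis convention assumed)."""
    rxx = np.mean(ex * np.conj(ex)).real   # X-channel auto-correlation
    ryy = np.mean(ey * np.conj(ey)).real   # Y-channel auto-correlation
    rxy = np.mean(ex * np.conj(ey))        # complex cross-correlation
    return rxx + ryy, rxx - ryy, 2.0 * rxy.real, -2.0 * rxy.imag

rng = np.random.default_rng(0)
n = 1 << 16
ex = rng.normal(size=n) + 1j * rng.normal(size=n)
ey = 0.3 * ex + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(stokes_from_channels(ex, ey))
```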
Muhlfeld, Clint C.; Marotz, Brian
2005-01-01
Despite the importance of large-scale habitat connectivity to the threatened bull trout Salvelinus confluentus, little is known about the life history characteristics and processes influencing natural dispersal of migratory populations. We used radiotelemetry to investigate the seasonal movements and habitat use by subadult bull trout (i.e., fish that emigrated from natal streams to the river system) tracked for varying durations from 1999 to 2002 in the upper Flathead River system in northwestern Montana. Telemetry data revealed migratory (N = 32 fish) and nonmigratory (N = 35 fish) behavior, indicating variable movement patterns in the subadult phase of bull trout life history. Most migrating subadults (84%) made rapid or incremental downriver movements (mean distance, 33 km; range, 6–129 km) to lower portions of the river system and to Flathead Lake during high spring flows and as temperatures declined in the fall and winter. Bull trout subadults used complex daytime habitat throughout the upper river system, including deep runs that contained unembedded boulder and cobble substrates, pools with large woody debris, and deep lake-influenced areas of the lower river system. Our results elucidate the importance of maintaining natural connections and a diversity of complex habitats over a large spatial scale to conserve the full expression of life history traits and processes influencing the natural dispersal of bull trout populations. Managers should seek to restore and enhance critical river corridor habitat and remove migration barriers, where possible, for recovery and management programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, F.; Stec, B; Pop, C
The death inducing signalling complex (DISC) formed by Fas receptor, FADD (Fas-associated death domain protein) and caspase 8 is a pivotal trigger of apoptosis [1-3]. The Fas-FADD DISC represents a receptor platform, which once assembled initiates the induction of programmed cell death. A highly oligomeric network of homotypic protein interactions comprised of the death domains of Fas and FADD is at the centre of DISC formation [4, 5]. Thus, characterizing the mechanistic basis for the Fas-FADD interaction is crucial for understanding DISC signalling but has remained unclear largely because of a lack of structural data. We have successfully formed and isolated the human Fas-FADD death domain complex and report the 2.7 Å crystal structure. The complex shows a tetrameric arrangement of four FADD death domains bound to four Fas death domains. We show that an opening of the Fas death domain exposes the FADD binding site and simultaneously generates a Fas-Fas bridge. The result is a regulatory Fas-FADD complex bridge governed by weak protein-protein interactions revealing a model where the complex itself functions as a mechanistic switch. This switch prevents accidental DISC assembly, yet allows for highly processive DISC formation and clustering upon a sufficient stimulus. In addition to depicting a previously unknown mode of death domain interactions, these results further uncover a mechanism for receptor signalling solely by oligomerization and clustering events.
ZMOTTO- MODELING THE INTERNAL COMBUSTION ENGINE
NASA Technical Reports Server (NTRS)
Zeleznik, F. J.
1994-01-01
The ZMOTTO program was developed to model mathematically a spark-ignited internal combustion engine. ZMOTTO is a large, general purpose program whose calculations can be established at five levels of sophistication. These five models range from an ideal cycle requiring only thermodynamic properties, to a very complex representation demanding full combustion kinetics, transport properties, and poppet valve flow characteristics. ZMOTTO is a flexible and computationally economical program based on a system of ordinary differential equations for cylinder-averaged properties. The calculations assume that heat transfer is expressed in terms of a heat transfer coefficient and that the cylinder average of kinetic plus potential energies remains constant. During combustion, the pressures of burned and unburned gases are assumed equal and their heat transfer areas are assumed proportional to their respective mass fractions. Even the simplest ZMOTTO model provides for residual gas effects, spark advance, exhaust gas recirculation, supercharging, and throttling. In the more complex models, 1) finite rate chemistry replaces equilibrium chemistry in descriptions of both the flame and the burned gases, 2) poppet valve formulas represent fluid flow instead of a zero pressure drop flow, and 3) flame propagation is modeled by mass burning equations instead of as an instantaneous process. Input to ZMOTTO is determined by the model chosen. Thermodynamic data is required for all models. Transport properties and chemical kinetics data are required only as the model complexity grows. Other input includes engine geometry, working fluid composition, operating characteristics, and intake/exhaust data. ZMOTTO accommodates a broad spectrum of reactants. The program will calculate many Otto cycle performance parameters for a number of consecutive cycles (a cycle being an interval of 720 crankangle degrees). A typical case will have a number of initial ideal cycles and progress through levels of nonideal cycles. ZMOTTO has restart capabilities and permits multicycle calculations with parameters varying from cycle to cycle. ZMOTTO is written in FORTRAN IV (IBM Level H) but has also been compiled with IBM VSFORTRAN (1977 standard). It was developed on an IBM 3033 under the TSS operating system and has also been implemented under MVS. Approximately 412K of 8 bit bytes of central memory are required in a nonpaging environment. ZMOTTO was developed in 1985.
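ZMOTTO's simplest level is an ideal cycle that needs only thermodynamic properties. As a rough stand-in for that level (not the ZMOTTO formulation, which uses full property data and ODE integration over crank angle), the sketch below evaluates the air-standard Otto cycle efficiency and net work for an assumed compression ratio and heat input.

```python
def air_standard_otto(compression_ratio: float, q_in: float, gamma: float = 1.4):
    """Ideal air-standard Otto cycle.
    q_in: heat added per unit mass during constant-volume combustion [J/kg].
    Returns (thermal efficiency, net work per unit mass [J/kg])."""
    eta = 1.0 - compression_ratio ** (1.0 - gamma)   # eta = 1 - r^(1 - gamma)
    return eta, eta * q_in

eta, w_net = air_standard_otto(compression_ratio=9.0, q_in=1.8e6)
print(f"efficiency = {eta:.3f}, net work = {w_net / 1e3:.0f} kJ/kg")
```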
Complex of simian virus 40 large-T antigen and host 53,000-molecular-weight protein in monkey cells.
Harlow, E; Pim, D C; Crawford, L V
1981-01-01
Mouse cells transformed by simian virus 40 (SV40) have been shown to contain a complex of the virus-coded large-T antigen with a host 53,000-molecular-weight (53K) protein. Initial attempts to detect a similar complex in lytically infected cells were unsuccessful, and it therefore seemed that the complex might be peculiar to transformed or abortively transformed nonpermissive cells. Immunoprecipitation of [32P]phosphate-labeled extracts of SV40-infected CV-1 African green monkey kidney cells with antibodies specific for large-T or the 53K protein revealed that the large-T-53K protein complex was formed during lytic infections. Only a minor fraction of the large-T present was associated with 53K protein, and large-T and the 53K host protein cosedimented during centrifugation through sucrose gradients. We used monospecific sera and monoclonal antibodies to study the rate of synthesis and phosphorylation of the 53K protein during lytic infections. Infection of CV-1 cells with SV40 increased the rate of synthesis of the 53K protein fivefold over that in mock-infected cells. At the same time, the rate of phosphorylation of the 53K protein increased more than 30-fold compared with control cultures. Monkey cells transformed by UV-irradiated SV40 (Gluzman et al., J. Virol. 22:256-266, 1977) also contained the large-T-53K protein complex. The formation of the complex is therefore not a peculiarity of SV40-transformed rodent cells but is a common feature of SV40 infections. PMID:6163871
Singh, Prafull Kumar; Roukounakis, Aristomenis; Frank, Daniel O; Kirschnek, Susanne; Das, Kushal Kumar; Neumann, Simon; Madl, Josef; Römer, Winfried; Zorzin, Carina; Borner, Christoph; Haimovici, Aladin; Garcia-Saez, Ana; Weber, Arnim; Häcker, Georg
2017-09-01
The Bcl-2 family protein Bim triggers mitochondrial apoptosis. Bim is expressed in nonapoptotic cells at the mitochondrial outer membrane, where it is activated by largely unknown mechanisms. We found that Bim is regulated by formation of large protein complexes containing dynein light chain 1 (DLC1). Bim rapidly inserted into cardiolipin-containing membranes in vitro and recruited DLC1 to the membrane. Bim binding to DLC1 induced the formation of large Bim complexes on lipid vesicles, on isolated mitochondria, and in intact cells. Native gel electrophoresis and gel filtration showed Bim-containing mitochondrial complexes of several hundred kilodaltons in all cells tested. Bim unable to form complexes was consistently more active than complexed Bim, which correlated with its substantially reduced binding to anti-apoptotic Bcl-2 proteins. At endogenous levels, Bim surprisingly bound only anti-apoptotic Mcl-1 but not Bcl-2 or Bcl-XL, recruiting only Mcl-1 into large complexes. Targeting of DLC1 by RNAi in human cell lines induced disassembly of Bim-Mcl-1 complexes and the proteasomal degradation of Mcl-1 and sensitized the cells to the Bcl-2/Bcl-XL inhibitor ABT-737. Regulation of apoptosis at mitochondria thus extends beyond the interaction of monomers of proapoptotic and anti-apoptotic Bcl-2 family members but involves more complex structures of proteins at the mitochondrial outer membrane, and targeting complexes may be a novel therapeutic strategy. © 2017 Singh et al.; Published by Cold Spring Harbor Laboratory Press.
Program for User-Friendly Management of Input and Output Data Sets
NASA Technical Reports Server (NTRS)
Klimeck, Gerhard
2003-01-01
A computer program manages large, hierarchical sets of input and output (I/O) parameters (typically, sequences of alphanumeric data) involved in computational simulations in a variety of technological disciplines. This program represents sets of parameters as structures coded in object-oriented but otherwise standard American National Standards Institute C language. Each structure contains a group of I/O parameters that make sense as a unit in the simulation program with which this program is used. The addition of options and/or elements to sets of parameters amounts to the addition of new elements to data structures. By association of child data generated in response to a particular user input, a hierarchical ordering of input parameters can be achieved. Associated with child data structures are the creation and description mechanisms within the parent data structures. Child data structures can spawn further child data structures. In this program, the creation and representation of a sequence of data structures is effected by one line of code that looks for children of a sequence of structures until there are no more children to be found. A linked list of structures is created dynamically and is completely represented in the data structures themselves. Such hierarchical data presentation can guide users through otherwise complex setup procedures and it can be integrated within a variety of graphical representations.
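The program described above is written in ANSI C, but the parent/child pattern it relies on is easy to sketch; the hedged Python analogue below (all names invented) builds a hierarchy of parameter groups and walks it until no more children are found, mirroring the traversal mentioned in the abstract.

```python
from dataclasses import dataclass, field

@dataclass
class ParamGroup:
    """A group of I/O parameters plus any child groups spawned by user input."""
    name: str
    params: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def add_child(self, child: "ParamGroup") -> "ParamGroup":
        self.children.append(child)
        return child

def walk(group: ParamGroup, depth: int = 0) -> None:
    """Traverse the hierarchy until no more children are found."""
    print("  " * depth + f"{group.name}: {group.params}")
    for child in group.children:
        walk(child, depth + 1)

root = ParamGroup("simulation", {"solver": "drift-diffusion"})
bias = root.add_child(ParamGroup("bias_sweep", {"start_V": 0.0, "stop_V": 1.0}))
bias.add_child(ParamGroup("output", {"save_iv_curve": True}))
walk(root)
```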
Python for Large-Scale Electrophysiology
Spacek, Martin; Blanche, Tim; Swindale, Nicholas
2008-01-01
Electrophysiology is increasingly moving towards highly parallel recording techniques which generate large data sets. We record extracellularly in vivo in cat and rat visual cortex with 54-channel silicon polytrodes, under time-locked visual stimulation, from localized neuronal populations within a cortical column. To help deal with the complexity of generating and analysing these data, we used the Python programming language to develop three software projects: one for temporally precise visual stimulus generation (“dimstim”); one for electrophysiological waveform visualization and spike sorting (“spyke”); and one for spike train and stimulus analysis (“neuropy”). All three are open source and available for download (http://swindale.ecc.ubc.ca/code). The requirements and solutions for these projects differed greatly, yet we found Python to be well suited for all three. Here we present our software as a showcase of the extensive capabilities of Python in neuroscience. PMID:19198646
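The internals of the three packages are not given in the abstract; as a generic illustration of the kind of step a spike-sorting front end performs, this hedged numpy sketch detects negative threshold crossings on a single extracellular channel (the threshold rule and refractory window are assumptions, not the spyke implementation).

```python
import numpy as np

def detect_spikes(trace: np.ndarray, fs: float, n_sigma: float = 4.5,
                  refractory_ms: float = 1.0) -> np.ndarray:
    """Return sample indices where the trace first drops below -n_sigma * noise."""
    sigma = np.median(np.abs(trace)) / 0.6745          # robust noise estimate
    thr = -n_sigma * sigma
    crossings = np.flatnonzero((trace[1:] < thr) & (trace[:-1] >= thr)) + 1
    min_gap = int(refractory_ms * 1e-3 * fs)           # refractory period in samples
    kept, last = [], -min_gap
    for idx in crossings:
        if idx - last >= min_gap:
            kept.append(idx)
            last = idx
    return np.asarray(kept)

fs = 25_000.0
trace = np.random.default_rng(1).normal(0.0, 1.0, int(fs))  # one second of noise
trace[5_000] -= 12.0                                         # injected spike-like deflection
print(detect_spikes(trace, fs))
```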
NASA Astrophysics Data System (ADS)
Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.
2017-12-01
We will present implementation and study of several use-cases of utilizing Virtual Reality (VR) for immersive display, interaction and analysis of large and complex 3D datasets. These datasets have been acquired by the instruments across several Earth, Planetary and Solar Space Robotics Missions. First, we will describe the architecture of the common application framework that was developed to input data, interface with VR display devices and program input controllers in various computing environments. Tethered and portable VR technologies will be contrasted and advantages of each highlighted. We'll proceed to presenting experimental immersive analytics visual constructs that enable augmentation of 3D datasets with 2D ones such as images and statistical and abstract data. We will conclude by presenting comparative analysis with traditional visualization applications and share the feedback provided by our users: scientists and engineers.
NASA Astrophysics Data System (ADS)
Doyle, Martin W.; Singh, Jai; Lave, Rebecca; Robertson, Morgan M.
2015-07-01
We use geomorphic surveys to quantify the differences between restored and nonrestored streams, and the differences between streams restored for market purposes (compensatory mitigation) and those restored under nonmarket programs. We also analyze the social and political-economic drivers of the stream restoration and mitigation industry using analysis of policy documents and interviews with key personnel including regulators, mitigation bankers, stream designers, and scientists. Restored streams are typically wider and geomorphically more homogenous than nonrestored streams. Streams restored for the mitigation market are typically headwater streams and part of a large complex of long restored main channels and many restored tributaries; streams restored for nonmarket purposes are typically shorter and consist of the main channel only. Interviews reveal that designers integrate many influences including economic and regulatory constraints, but traditions of practice have a large influence as well. Thus, social forces shape the morphology of restored streams.
Multi-Object Spectroscopy with MUSE
NASA Astrophysics Data System (ADS)
Kelz, A.; Kamann, S.; Urrutia, T.; Weilbacher, P.; Bacon, R.
2016-10-01
Since 2014, MUSE, the Multi-Unit Spectroscopic Explorer, is in operation at the ESO-VLT. It combines a superb spatial sampling with a large wavelength coverage. By design, MUSE is an integral-field instrument, but its field-of-view and large multiplex make it a powerful tool for multi-object spectroscopy too. Every data-cube consists of 90,000 image-sliced spectra and 3700 monochromatic images. In autumn 2014, the observing programs with MUSE have commenced, with targets ranging from distant galaxies in the Hubble Deep Field to local stellar populations, star formation regions and globular clusters. This paper provides a brief summary of the key features of the MUSE instrument and its complex data reduction software. Some selected examples are given, how multi-object spectroscopy for hundreds of continuum and emission-line objects can be obtained in wide, deep and crowded fields with MUSE, without the classical need for any target pre-selection.
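A MUSE data cube is a 3D array with one wavelength axis and two spatial axes; the hedged numpy sketch below shows the two basic cube operations implied here, pulling a single-spaxel spectrum and a monochromatic image out of a synthetic cube (axis order, sampling, and sizes are assumptions, not the pipeline's FITS layout).

```python
import numpy as np

# Synthetic cube with shape (n_wavelength, n_y, n_x); real cubes come from FITS files.
n_wave, n_y, n_x = 3700, 50, 50
cube = np.random.default_rng(2).normal(size=(n_wave, n_y, n_x)).astype(np.float32)
wavelengths = np.linspace(4750.0, 9350.0, n_wave)       # Angstrom, assumed sampling

spectrum = cube[:, 25, 40]                               # one spectrum per spatial pixel
halpha_slice = cube[np.argmin(np.abs(wavelengths - 6563.0)), :, :]  # one monochromatic image

print(spectrum.shape, halpha_slice.shape)
```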
Medeiros, Daniel Meulemans; Crump, J. Gage
2012-01-01
Patterning of the vertebrate facial skeleton involves the progressive partitioning of neural-crest-derived skeletal precursors into distinct subpopulations along the anteroposterior (AP) and dorsoventral (DV) axes. Recent evidence suggests that complex interactions between multiple signaling pathways, in particular Endothelin-1 (Edn1), Bone Morphogenetic Protein (BMP), and Jagged-Notch, are needed to pattern skeletal precursors along the DV axis. Rather than directly determining the morphology of individual skeletal elements, these signals appear to act through several families of transcription factors, including Dlx, Msx, and Hand, to establish dynamic zones of skeletal differentiation. Provocatively, this patterning mechanism is largely conserved from mouse and zebrafish to the jawless vertebrate, lamprey. This implies that the diversification of the vertebrate facial skeleton, including the evolution of the jaw, was driven largely by modifications downstream of a conserved pharyngeal DV patterning program. PMID:22960284
#Learning: The use of back channel technology in multi-campus nursing education.
Yates, Karen; Birks, Melanie; Woods, Cindy; Hitchins, Marnie
2015-09-01
This paper reports on the results of a study into the use of microblogging technology (TodaysMeet) in large, multi-site lectures in a nursing program. The aim of this study was to investigate students' use of the technology and their perceptions of its value in stimulating engagement in a complex learning environment. The study demonstrated that students like the anonymity that the technology provided, allowing them to ask questions without fear of appearing less competent than their peers. Many of the respondents commented positively on the opportunity to engage with students and the lecturer at other campuses. While some students appreciated the opportunity to interact and have feedback from peers, others saw this as a negative aspect of the technology. This study suggests that, used appropriately, microblogging can be incorporated into large lectures to promote student participation and engagement and ultimately enhance the learning process. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Vos, Lynn
2013-01-01
This article looks at the curriculum redesign of a master's-level program in international marketing from a UK perspective. In order to ensure that the program would be more fit-for-purpose for future managers working under conditions of complexity, uncertainty, and within regimes often very different from the home market, the team began the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brickstad, B.; Bergman, M.
A computerized procedure has been developed that predicts the growth of an initial circumferential surface crack through a pipe and further on to failure. The crack growth mechanism can be either fatigue or stress corrosion. Consideration is given to complex crack shapes, and for through-wall cracks, crack opening areas and leak rates are also calculated. The procedure is based on a large number of three-dimensional finite element calculations of cracked pipes. The results from these calculations are stored in a database from which the PC program, denoted LBBPIPE, reads all necessary information. In this paper, a sensitivity analysis is presented for cracked pipes subjected to both stress corrosion and vibration fatigue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. W. Allendorf; B. W. Bellow; R. F. Boehm
Three low-pressure rocket motor propellant burn tests were performed in a large, sealed test chamber located at the X-tunnel complex on the Department of Energy's Nevada Test Site in the period May--June 1997. NIKE rocket motors containing double base propellant were used in two tests (two and four motors, respectively), and the third test used two improved HAWK rocket motors containing composite propellant. The preliminary containment safety calculations, the crack and burn procedures used in each test, and the results of various measurements made during and after each test are all summarized and collected in this document.
Lystrom, David J.
1972-01-01
Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
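Part II itself is not reproduced here; as a hedged sketch of the simplest class of check it describes, the snippet below flags discharge values that deviate from a simulated estimate by more than a tolerance and substitutes the simulated value for flagged or missing readings (the 25 % tolerance is an assumption within the 20-30 percent band quoted above).

```python
def verify_discharge(observed, simulated, tolerance=0.25):
    """Keep observations within `tolerance` of the simulated value; otherwise
    substitute the simulated estimate. Returns (verified series, error flags)."""
    verified, flags = [], []
    for obs, sim in zip(observed, simulated):
        bad = obs is None or sim <= 0 or abs(obs - sim) / sim > tolerance
        verified.append(sim if bad else obs)
        flags.append(bad)
    return verified, flags

obs = [102.0, None, 340.0, 98.0]   # discharge readings with one missing, one suspect value
sim = [100.0, 110.0, 120.0, 100.0]
print(verify_discharge(obs, sim))
```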
Human-Robot Teaming in a Multi-Agent Space Assembly Task
NASA Technical Reports Server (NTRS)
Rehnmark, Fredrik; Currie, Nancy; Ambrose, Robert O.; Culbert, Christopher
2004-01-01
NASA's Human Space Flight program depends heavily on spacewalks performed by pairs of suited human astronauts. These Extra-Vehicular Activities (EVAs) are severely restricted in both duration and scope by consumables and available manpower. An expanded multi-agent EVA team combining the information-gathering and problem-solving skills of humans with the survivability and physical capabilities of robots is proposed and illustrated by example. Such teams are useful for large-scale, complex missions requiring dispersed manipulation, locomotion and sensing capabilities. To study collaboration modalities within a multi-agent EVA team, a 1-g test is conducted with humans and robots working together in various supporting roles.
BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.
Huang, Hailiang; Tata, Sandeep; Prill, Robert J
2013-01-01
Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
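BlueSNP's own API is not shown in the abstract; to make concrete the statistical idea of estimating an empirical p-value by permutation (the step that makes these workloads heavy enough to need a cluster), here is a hedged single-machine Python sketch using a simple mean-difference statistic rather than BlueSNP's tests.

```python
import numpy as np

def permutation_pvalue(phenotype, genotype, n_perm=10_000, seed=0):
    """Empirical p-value for association between a binary genotype grouping and a
    quantitative phenotype, estimated by permuting phenotype labels."""
    rng = np.random.default_rng(seed)
    phenotype = np.asarray(phenotype, dtype=float)
    genotype = np.asarray(genotype)
    observed = abs(phenotype[genotype == 1].mean() - phenotype[genotype == 0].mean())
    exceed = 0
    for _ in range(n_perm):
        shuffled = rng.permutation(phenotype)
        stat = abs(shuffled[genotype == 1].mean() - shuffled[genotype == 0].mean())
        exceed += stat >= observed
    return (exceed + 1) / (n_perm + 1)   # add-one correction avoids reporting p = 0

geno = np.array([0, 0, 0, 0, 1, 1, 1, 1])
pheno = np.array([1.0, 1.2, 0.9, 1.1, 2.0, 2.2, 1.9, 2.1])
print(permutation_pvalue(pheno, geno, n_perm=2_000))
```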
A computer simulator for development of engineering system design methodologies
NASA Technical Reports Server (NTRS)
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
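A hedged sketch of the simulator concept (not the NASA code): two "subsystems" are replaced by cheap analytical functions with a data coupling between them, and a fixed-point loop mimics the iteration a candidate multilevel optimization strategy would have to drive. All functions and coefficients are invented for illustration.

```python
import math

# Stand-in analytical "subsystems": each consumes the other's output.
def aero_subsystem(x_design: float, structural_weight: float) -> float:
    return 10.0 + 2.0 * x_design + 0.05 * structural_weight      # e.g. a drag/load proxy

def structures_subsystem(x_design: float, aero_load: float) -> float:
    return 100.0 + 5.0 * math.sqrt(x_design) + 0.2 * aero_load   # e.g. a weight proxy

def converge(x_design: float, tol: float = 1e-6, max_iter: int = 100):
    """Fixed-point iteration over the subsystem coupling for one design point."""
    weight, load = 100.0, 0.0
    for _ in range(max_iter):
        new_load = aero_subsystem(x_design, weight)
        new_weight = structures_subsystem(x_design, new_load)
        if abs(new_weight - weight) < tol and abs(new_load - load) < tol:
            break
        weight, load = new_weight, new_load
    return load, weight

print(converge(x_design=3.0))
```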
Materials @ LANL: Solutions for National Security Challenges
NASA Astrophysics Data System (ADS)
Teter, David
2012-10-01
Materials science activities impact many programmatic missions at LANL including nuclear weapons, nuclear energy, renewable energy, global security and nonproliferation. An overview of the LANL materials science strategy and examples of materials science programs will be presented. Major materials leadership areas are in materials dynamics, actinides and correlated electron materials, materials in radiation extremes, energetic materials, integrated nanomaterials and complex functional materials. Los Alamos is also planning a large-scale, signature science facility called MaRIE (Matter Radiation Interactions in Extremes) to address in-situ characterization of materials in dynamic and radiation environments using multiple high energy probes. An overview of this facility will also be presented.
NASA and the U.S. climate program - A problem in data management
NASA Technical Reports Server (NTRS)
Quann, J. J.
1978-01-01
NASA's contribution to the total data base for the National Climate Plan will be to produce climate data sets from its experimental space observing systems and to maximize the value of these data for climate analysis and prediction. Validated data sets will be provided to NOAA for inclusion into their overall diagnostic data base. NASA data management for the Climate Plan will involve: (1) cataloging and retrieval of large integrated and distributed data sets upon user demand, and (2) the storage equivalent of 100,000 digital data tapes. It will be the largest, most complex data system ever developed by NASA
Aerial photo shows RLV complex at KSC
NASA Technical Reports Server (NTRS)
2000-01-01
This closeup photo shows the Reusable Launch Vehicle (RLV) Support Complex at Kennedy Space Center. At right is a multi- purpose hangar and to the left is a building for related ground support equipment and administrative/ technical support. The complex is situated at the Shuttle Landing Facility. The RLV complex will be available to accommodate the Space Shuttle; the X-34 RLV technology demonstrator; the L-1011 carrier aircraft for Pegasus and X-34; and other RLV and X-vehicle programs. The complex is jointly funded by the Spaceport Florida Authority, NASA's Space Shuttle Program and KSC.
NASA Astrophysics Data System (ADS)
Chuvashov, I. N.
2011-07-01
This paper describes a complex of algorithms and programs for solving inverse problems of artificial Earth satellite dynamics. The complex is intended for satellite orbit improvement, estimation of motion model parameters, and related tasks. The program complex was developed for the "Skiff Cyberia" cluster. Results of numerical experiments obtained by using the new complex together with the program "Numerical model of the system artificial satellites motion" are also presented.
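Orbit improvement of this kind is typically posed as a nonlinear least-squares problem: model parameters (initial state, force-model coefficients) are corrected iteratively until predicted observations match measured ones. The sketch below shows the Gauss-Newton differential-correction step on a toy one-dimensional motion model; the model, names, and data are assumptions for illustration, not the program complex described above.

```python
import numpy as np

def predict(params, t):
    """Toy 'motion model': position under constant acceleration.
    params = (x0, v0, a); a real orbit-improvement code would instead
    numerically integrate the equations of motion with its force models."""
    x0, v0, a = params
    return x0 + v0 * t + 0.5 * a * t ** 2

def jacobian(params, t):
    """Partial derivatives of the predicted observation w.r.t. the parameters."""
    return np.column_stack([np.ones_like(t), t, 0.5 * t ** 2])

def differential_correction(t, observed, params0, n_iter=10):
    """Gauss-Newton iteration: solve a linear least-squares problem for the correction."""
    params = np.asarray(params0, dtype=float)
    for _ in range(n_iter):
        residuals = observed - predict(params, t)
        J = jacobian(params, t)
        delta, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        params += delta
        if np.linalg.norm(delta) < 1e-12:
            break
    return params

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 50)
    truth = (1.0, 2.0, -0.5)
    obs = predict(truth, t) + rng.normal(scale=0.05, size=t.size)
    print(differential_correction(t, obs, params0=(0.0, 0.0, 0.0)))
```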
Application of the generalized reduced gradient method to conceptual aircraft design
NASA Technical Reports Server (NTRS)
Gabriele, G. A.
1984-01-01
The complete aircraft design process can be broken into three phases of increasing depth: conceptual design, preliminary design, and detail design. Conceptual design consists primarily of developing general arrangements and selecting the configuration that optimally satisfies all mission requirements. The result of the conceptual phase is a conceptual baseline configuration that serves as the starting point for the preliminary design phase. The conceptual design of an aircraft involves a complex trade-off of many independent variables that must be investigated before deciding upon the basic configuration. Some of these variables are discrete (number of engines), some represent different configurations (canard vs conventional tail) and some may represent incorporation of new technologies (aluminum vs composite materials). At Lockheed-Georgia, the sizing program is known as GASP (Generalized Aircraft Sizing Program). GASP is a large program containing analysis modules covering the many different disciplines involved in defining the aircraft, such as aerodynamics, structures, stability and control, mission performance, and cost. These analysis modules provide first-level estimates of the aircraft properties that are derived from handbook, experimental, and historical sources.
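The core idea of a reduced gradient method is to use active equality constraints to eliminate "basic" variables and then descend along the gradient of the reduced objective in the remaining "nonbasic" variables. The following is a minimal sketch on an invented toy problem with one linear constraint; it illustrates the reduced-gradient idea only and is not the generalized reduced gradient implementation or the GASP program referenced in the abstract.

```python
# Minimize f(x, y) = (x - 2)^2 + (y - 1)^2  subject to  x + 2*y = 3.
# Reduced-gradient idea: use the constraint to eliminate the basic variable
# (x = 3 - 2*y) and descend on the reduced objective in the nonbasic variable y.

def grad_f(x, y):
    """Gradient of the unconstrained objective."""
    return 2.0 * (x - 2.0), 2.0 * (y - 1.0)

def reduced_gradient(y):
    """d f / d y along the constraint surface: dx/dy = -2 from x + 2y = 3."""
    x = 3.0 - 2.0 * y
    gx, gy = grad_f(x, y)
    return gy + gx * (-2.0)

def solve(y0=0.0, step=0.1, n_iter=200):
    """Simple steepest descent on the reduced objective."""
    y = y0
    for _ in range(n_iter):
        g = reduced_gradient(y)
        if abs(g) < 1e-10:
            break
        y -= step * g
    x = 3.0 - 2.0 * y
    return x, y

if __name__ == "__main__":
    print(solve())   # converges to x = 1.8, y = 0.6 for this toy problem
```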
Status of DSMT research program
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Javeed, Mehzad; Edighoffer, Harold H.
1991-01-01
The status of the Dynamic Scale Model Technology (DSMT) research program is presented. DSMT is developing scale model technology for large space structures as part of the Control Structure Interaction (CSI) program at NASA Langley Research Center (LaRC). Under DSMT, a hybrid-scale structural dynamics model of Space Station Freedom was developed. Space Station Freedom was selected as the focus structure for DSMT since the station represents the first opportunity to obtain flight data on a complex, three-dimensional space structure. An overview of DSMT is included, covering the development of the space station scale model and the resulting hardware. Scaling technology was developed for this model to achieve a ground test article that existing test facilities can accommodate while employing realistically scaled hardware. The model was designed and fabricated by the Lockheed Missile and Space Co. and is assembled at LaRC for dynamic testing. Results from ground tests and analyses of the various model components are presented along with plans for future subassembly and mated model tests. Finally, utilization of the scale model for enhancing analysis verification of the full-scale space station is also considered.
A historical perspective of the YF-12A thermal loads and structures program
NASA Technical Reports Server (NTRS)
Jenkins, Jerald M.; Quinn, Robert D.
1996-01-01
Around 1970, the YF-12A loads and structures efforts focused on numerous technological issues that needed to be defined with regard to aircraft that incorporate hot structures in the design. Laboratory structural heating test technology with infrared systems was largely created during this program. The program demonstrated the ability to duplicate the complex flight temperatures of an advanced supersonic airplane in a ground-based laboratory. The ability to heat and load an advanced operational aircraft in a laboratory at high temperatures and return it to flight status without adverse effects was demonstrated. The technology associated with measuring loads with strain gages on a hot structure was demonstrated with a thermal calibration concept. The results demonstrated that the thermal stresses were significant although the airplane was designed to reduce thermal stresses. Considerable modeling detail was required to predict the heat transfer and the corresponding structural characteristics. The overall YF-12A research effort was particularly productive, and a great deal of flight, laboratory test, and computational data were produced and cross-correlated.
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
Streamflow prediction using multi-site rainfall obtained from hydroclimatic teleconnection
NASA Astrophysics Data System (ADS)
Kashid, S. S.; Ghosh, Subimal; Maity, Rajib
2010-12-01
Simultaneous variations in weather and climate over widely separated regions are commonly known as "hydroclimatic teleconnections". Rainfall and runoff patterns over continents are found to be significantly teleconnected with large-scale circulation patterns through such hydroclimatic teleconnections. Though such teleconnections exist in nature, they are very difficult to model because of their inherent complexity. Statistical techniques and Artificial Intelligence (AI) tools have gained popularity in modeling hydroclimatic teleconnections because of their ability to capture the complicated relationship between the predictors (e.g., sea surface temperatures) and the predictand (e.g., rainfall). Genetic Programming is one such AI tool; its flexible functional structure makes it capable of capturing nonlinear relationships between predictor and predictand. In the present study, gridded multi-site weekly rainfall is predicted from El Niño Southern Oscillation (ENSO) indices, Equatorial Indian Ocean Oscillation (EQUINOO) indices, Outgoing Longwave Radiation (OLR), and lagged rainfall at grid points over the catchment using Genetic Programming. The predicted rainfall is then used in a second Genetic Programming model to predict streamflow. The model is applied to weekly forecasting of streamflow in the Mahanadi River, India, and satisfactory performance is observed.
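The two-stage prediction chain described above (climate indices to rainfall, then predicted rainfall plus lagged flow to streamflow) can be sketched as below. Ordinary least squares stands in for the Genetic Programming models, and all data and predictor names are synthetic placeholders; this is a structural illustration of the cascade, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
# Synthetic stand-ins for the large-scale predictors and catchment variables.
enso, equinoo, olr = rng.normal(size=(3, n))
lag_rain, lag_flow = rng.normal(size=(2, n))
rain = 0.6 * enso - 0.4 * equinoo + 0.3 * olr + 0.5 * lag_rain + rng.normal(scale=0.2, size=n)
flow = 1.2 * rain + 0.4 * lag_flow + rng.normal(scale=0.3, size=n)

def fit(X, y):
    """Least-squares coefficients with an intercept column (GP stand-in)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef

def apply(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# Stage 1: rainfall from large-scale indices and lagged rainfall.
X_rain = np.column_stack([enso, equinoo, olr, lag_rain])
rain_model = fit(X_rain[:250], rain[:250])
rain_hat = apply(rain_model, X_rain)

# Stage 2: streamflow from *predicted* rainfall and lagged flow.
X_flow = np.column_stack([rain_hat, lag_flow])
flow_model = fit(X_flow[:250], flow[:250])
flow_hat = apply(flow_model, X_flow[250:])

rmse = np.sqrt(np.mean((flow_hat - flow[250:]) ** 2))
print("hold-out RMSE:", round(rmse, 3))
```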
Monitoring challenges and innovative ideas
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Neill, R.V.; Hunsaker, C.T.; Levine, D.A.
1990-01-01
Monitoring programs are difficult to design even when they focus on specific problems. Ecosystems are complex, and it is often impossible to predetermine what aspects of system structure or dynamics will respond to a specific insult. It is equally difficult to interpret whether a response is a stabilizing compensatory mechanism or a real loss of capacity to maintain the ecosystem. The problems are compounded in a broad monitoring program designed to assess "ecosystem health" at regional and continental scales. It is challenging in the extreme to monitor ecosystem response, at any scale, to past insults as well as an unknown future array of impacts. The present paper will examine some of the fundamental issues and challenges raised by large-scale monitoring efforts. The challenges will serve as a framework and as an excuse to discuss several important topics in more detail. Following the discussion of challenges, we suggest some basic innovations that could be important across a range of monitoring programs. The innovations include integrative measures, innovative methodology, and creative interpretation. 59 refs., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report describes the results of a technical, financial, and environmental assessment study for a project which would have included a new TCS micronized coal-fired heating plant for the Produkcja I Hodowla Roslin Ogrodniczych (PHRO) Greenhouse Complex in Krzeszowice, Poland. The project site is about 20 miles west of Krakow, Poland. During the project study period, PHRO utilized 14 heavy oil-fired boilers to produce heat for its greenhouse facilities and also home heating for several adjacent apartment housing complexes. The boilers burn a high-sulfur-content heavy crude oil called mazute. The project study was conducted during a period extending from March 1996 through February 1997. For size orientation, the PHRO Greenhouse Complex grows a variety of vegetables and flowers for the Southern Poland marketplace; the greenhouse area under glass is very large, equivalent to approximately 50 football fields. The new micronized coal-fired boiler would have: (1) provided a significant portion of the heat for PHRO and a portion of the adjacent apartment housing complexes, (2) dramatically reduced sulfur dioxide air pollution emissions while satisfying new Polish air regulations, and (3) provided attractive savings to PHRO based on the quantity of displaced oil.
Plant Phenotyping through the Eyes of Complex Systems: Theoretical Considerations
NASA Astrophysics Data System (ADS)
Kim, J.
2017-12-01
Plant phenotyping is an emerging transdisciplinary research area which necessitates not only communication and collaboration among scientists from different disciplines but also a paradigm shift toward a holistic approach. A complex system is defined as a system having a large number of interacting parts (or particles, agents) whose interactions give rise to non-trivial properties like self-organization and emergence. Plant ecosystems are complex systems: continually morphing dynamical systems, i.e., self-organizing hierarchical open systems. Such systems are composed of many subunits/subsystems with nonlinear interactions and feedback. The throughput, such as the flow of energy, matter, and information, is the key control parameter in complex systems. Information theoretic approaches can be used to understand and identify such interactions, structures, and dynamics through reductions in uncertainty (i.e., entropy). Theoretical considerations based on network and thermodynamic thinking, together with exemplary analyses (e.g., dynamic process networks, spectral entropy) of throughput time series, will be presented. These can be used as a framework to develop more discipline-specific fundamental approaches that provide tools for the transferability of traits between measurement scales in plant phenotyping. Acknowledgment: This work was funded by the Weather Information Service Engine Program of the Korea Meteorological Administration under Grant KMIPA-2012-0001.
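One of the quantities mentioned above, spectral entropy, is commonly computed as the Shannon entropy of a time series' normalized power spectrum. The sketch below is a minimal, generic implementation (the synthetic signals and normalization choice are assumptions for illustration), not the specific analysis presented in the abstract.

```python
import numpy as np

def spectral_entropy(x, base=2):
    """Shannon entropy of the normalized power spectrum of a time series.

    Low values indicate power concentrated in a few frequencies (regular,
    predictable dynamics); values near 1 after normalization indicate a
    flat, noise-like spectrum.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    p = power / np.sum(power)
    p = p[p > 0]                                   # drop empty bins
    h = -np.sum(p * np.log(p) / np.log(base))
    return h / (np.log(len(p)) / np.log(base))     # normalize to [0, 1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(2048)
    periodic = np.sin(2 * np.pi * t / 24)          # diurnal-like signal
    noisy = periodic + rng.normal(scale=1.0, size=t.size)
    print("periodic:", round(spectral_entropy(periodic), 3))
    print("noisy   :", round(spectral_entropy(noisy), 3))
```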