Sample records for zone modeling software

  1. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The U.S. Environmental Protection Agency has a long history of both supporting plume model development and providing mixing zone modeling software. The Visual Plumes model is the most recent addition to the suite of public-domain models available through the EPA-Athens Center f...

  2. Federal Highway Administration (FHWA) work zone driver model software

    DOT National Transportation Integrated Search

    2016-11-01

    FHWA and the U.S. Department of Transportation (USDOT) Volpe Center are developing a work zone car-following model and simulation software that interfaces with existing microsimulation tools, enabling more accurate simulation of car-following through...

  3. Evaluation of work zone enhancement software programs.

    DOT National Transportation Integrated Search

    2009-09-01

    The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, VISSIM, and Spreadsheet models are the tools that MoD...

  4. COMPILATION OF SATURATED AND UNSATURATED ZONE MODELING SOFTWARE

    EPA Science Inventory

    The full report provides readers an overview of available ground-water modeling programs and related software. It is an update of EPA/600/R-93/118 and EPA/600/R-94/028, two previous reports from the same program at the International Ground Water Modeling Center (IGWMC) in Colora...

  5. Calibration of work zone impact analysis software for Missouri.

    DOT National Transportation Integrated Search

    2013-12-01

    This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...

  6. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the Earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single

  7. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : executive summary report.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part a digital computer simulation model was developed to model traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  8. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part a digital computer simulation model was developed to model traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  9. Integrated software for the detection of epileptogenic zones in refractory epilepsy.

    PubMed

    Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia

    2010-01-01

    In this paper we present integrated software designed to help nuclear medicine physicians detect epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. This tool was designed to be flexible, friendly and efficient. A novel detection method (A-contrario) was included along with the classical detection method (subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer and objective analysis of virtual brain phantom experiments by proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians detect EZ in clinical practice.
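
    A minimal sketch of the classical subtraction analysis this record mentions, assuming co-registered, intensity-normalized volumes (the authors' A-contrario method is not reproduced); the array contents and threshold are illustrative:

        # Flag voxels whose ictal uptake exceeds the interictal baseline
        # by more than z_thresh standard deviations.
        import numpy as np

        def subtraction_map(ictal, interictal, z_thresh=2.0):
            ictal = ictal / ictal.mean()                # normalize mean uptake
            interictal = interictal / interictal.mean()
            diff = ictal - interictal
            z = (diff - diff.mean()) / diff.std()
            return z > z_thresh

        ictal = np.random.rand(64, 64, 64)              # placeholder volumes
        interictal = np.random.rand(64, 64, 64)
        print(subtraction_map(ictal, interictal).sum(), "voxels flagged")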

  10. Evaluation of using digital gravity field models for zoning map creation

    NASA Astrophysics Data System (ADS)

    Loginov, Dmitry

    2018-05-01

    Digital cartographic models of geophysical fields are taking on special significance in geophysical mapping. One important application is the creation of zoning maps, which take the morphology of the geophysical field into account when automating the choice of contour intervals. The purpose of this work is a comparative evaluation of various digital models for the creation of an integrated gravity field zoning map. Two models were compared: the digital gravity field model of Russia, created from the analog map at a scale of 1:2,500,000, and the open global gravity field model of the Earth, WGM2012. As a result of the experimental work, four integrated gravity field zoning maps were obtained using raw and processed data from each gravity field model. The study demonstrates that open data can be used to create integrated zoning maps, provided the noise component of the model is removed by processing in specialized software systems. In that case, for the automated choice of contour intervals, the open digital models are not inferior to regional gravity field models created for individual countries. This supports the universality of integrated zoning map creation, regardless of the level of detail of the digital cartographic model of geophysical fields.
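
    As an illustration of automating the choice of contour intervals from field morphology, a sketch using quantile class breaks (one common rule; the paper's own algorithm is not specified here), applied to a placeholder grid:

        import numpy as np

        def quantile_intervals(field, n_classes=8):
            # Class boundaries giving each zone roughly equal map area.
            return np.percentile(field, np.linspace(0, 100, n_classes + 1))

        gravity = np.random.normal(0.0, 15.0, size=(500, 500))  # mGal, synthetic
        print(quantile_intervals(gravity))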

  11. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  12. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
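
    Schematically, the model family described here is a size-driven power law scaled by multipliers derived from the questionnaire responses; a sketch with illustrative constants, not the calibrated JPL values:

        def effort_person_months(ksloc, multipliers, a=2.8, b=1.1):
            # effort = a * size^b * product of environment/technology factors
            m = 1.0
            for factor in multipliers.values():
                m *= factor
            return a * ksloc**b * m

        answers = {"task_difficulty": 1.15, "dev_environment": 0.95,
                   "sw_technology": 1.05}  # hypothetical questionnaire scores
        print(round(effort_person_months(32.0, answers), 1), "person-months")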

  13. Mushy zone modeling

    NASA Astrophysics Data System (ADS)

    Glicksman, Martin E.; Smith, Richard N.; Marsh, Steven P.; Kuklinski, Robert

    A key element of mushy zone modeling is the description of the microscopic evolution of the lengthscales within the mushy zone and the influence of macroscopic transport processes. This paper describes some recent progress in developing a mean-field statistical theory of phase coarsening in adiabatic mushy zones. The main theoretical predictions are temporal scaling laws indicating that the average lengthscale increases as time^(1/3), a self-similar distribution of mushy zone lengthscales based on spherical solid particle shapes, and kinetic rate constants that provide the dependence of the coarsening process on material parameters and the volume fraction of the solid phase. High-precision thermal decay experiments are described which verify aspects of the theory in pure-material mushy zones held under adiabatic conditions. The microscopic coarsening theory is then integrated within a macroscopic heat transfer model of one-dimensional alloy solidification, using the Double Integral Method. The method demonstrates an ability to predict the influence of macroscopic heat transfer on the evolution of primary and secondary dendrite arm spacings in Al-Cu alloys. Finally, some suggestions are made for future experimental and theoretical studies required in developing comprehensive solidification processing models.
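
    A worked illustration of the quoted scaling law: in mean-field coarsening the cube of the average lengthscale grows linearly in time, L(t) = (L0^3 + K t)^(1/3); L0 and K below are illustrative, not the experimental values:

        L0 = 10e-6   # initial mean lengthscale, m
        K = 1e-18    # kinetic rate constant, m^3/s (illustrative)

        for t in (1.0, 10.0, 100.0, 1000.0):          # seconds
            L = (L0**3 + K * t) ** (1.0 / 3.0)
            print(f"t = {t:7.1f} s  ->  L = {L * 1e6:6.2f} um")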

  14. Software Cost-Estimation Model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.

  15. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The US Environmental Protection Agency has a history of developing plume models and providing technical assistance. The Visual Plumes model (VP) is a recent addition to the public-domain models available on the EPA Center for Exposure Assessment Modeling (CEAM) web page. The Wind...

  16. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  17. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  18. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  19. The site-scale saturated zone flow model for Yucca Mountain: Calibration of different conceptual models and their impact on flow paths

    USGS Publications Warehouse

    Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.

    2003-01-01

    This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM and parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternate conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of the conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of model to data, and do not represent important differences in hydrologic conditions for the different conceptual models. © 2002 Elsevier Science B.V.

  1. Mental Models of Software Forecasting

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.

    1993-01-01

    The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models that are currently available do not correspond with the mental models of the software engineers. In an earlier JPL-funded study (Hihn and Habib-agahi, 1991) it was found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs (cost estimating relationships) hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning when the most important forecasts must be made.

  2. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  3. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  4. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  5. Approximate Model of Zone Sedimentation

    NASA Astrophysics Data System (ADS)

    Dzianik, František

    2011-12-01

    The process of zone sedimentation is affected by many factors that cannot be expressed analytically. For this reason, zone settling is evaluated in practice experimentally or by applying an empirical mathematical description of the process. The paper presents the development of an approximate model of zone settling, i.e. a general function that should properly approximate the behaviour of the settling process within its entire range and under various conditions. Furthermore, the specification of the model parameters by regression analysis of settling test results is shown. The suitability of the model is assessed by graphical dependencies and by statistical coefficients of correlation. The approximate model could also be useful in simplifying the process design of continuous settling tanks and thickeners.
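
    In the spirit of this record, a minimal sketch of specifying model parameters by regression on batch settling test data; the exponential interface-height function and the data points are illustrative, not the author's actual model:

        import numpy as np
        from scipy.optimize import curve_fit

        def settling_curve(t, h_inf, h0, tau):
            # Interface height decaying from h0 toward h_inf.
            return h_inf + (h0 - h_inf) * np.exp(-t / tau)

        t = np.array([0.0, 5, 10, 20, 40, 60, 90])                 # minutes
        h = np.array([1.0, 0.82, 0.68, 0.49, 0.33, 0.28, 0.26])    # meters

        (h_inf, h0, tau), _ = curve_fit(settling_curve, t, h,
                                        p0=(0.25, 1.0, 20.0))
        print(f"h_inf={h_inf:.3f} m, h0={h0:.3f} m, tau={tau:.1f} min")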

  6. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
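
    The comparison step reduces to running the same inputs through both artifacts and checking agreement to a tolerance; a sketch with stand-in functions (not the PVS models or the actual flight code):

        import math, random

        def model_position(v, t):           # stand-in for the animated spec
            return v * t

        def implementation_position(v, t):  # stand-in for the implementation
            return v * t * (1.0 + 1e-12)    # tiny floating-point drift

        random.seed(0)
        for _ in range(1000):
            v, t = random.uniform(0, 250), random.uniform(0, 3600)
            assert math.isclose(model_position(v, t),
                                implementation_position(v, t), rel_tol=1e-9)
        print("all generated cases agree within tolerance")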

  7. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  8. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
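
    A minimal sketch of estimation by analogy with k-nearest neighbors, the core idea named above; the historical projects and efforts are invented:

        import numpy as np

        past = np.array([[10, 3], [25, 5], [40, 8], [80, 12], [120, 20]],
                        float)                                # [KSLOC, team size]
        effort = np.array([30.0, 85.0, 160.0, 390.0, 700.0])  # person-months

        def knn_estimate(features, k=3):
            scaled = (past - past.mean(0)) / past.std(0)      # normalize features
            q = (np.asarray(features, float) - past.mean(0)) / past.std(0)
            idx = np.argsort(((scaled - q) ** 2).sum(1))[:k]  # closest analogies
            return effort[idx].mean()

        print(round(knn_estimate([50, 9]), 1), "person-months")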

  9. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  10. Modeling work zone crash frequency by quantifying measurement errors in work zone length.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Yildirimoglu, Mehmet

    2013-06-01

    Work zones are temporary traffic control zones that can potentially cause safety problems. Maintaining safety, while implementing necessary changes on roadways, is an important challenge traffic engineers and researchers have to confront. In this study, the risk factors in work zone safety evaluation were identified through the estimation of a crash frequency (CF) model. Measurement errors in explanatory variables of a CF model can lead to unreliable estimates of certain parameters. Among these, work zone length raises a major concern in this analysis because it may change as the construction schedule progresses generally without being properly documented. This paper proposes an improved modeling and estimation approach that involves the use of a measurement error (ME) model integrated with the traditional negative binomial (NB) model. The proposed approach was compared with the traditional NB approach. Both models were estimated using a large dataset that consists of 60 work zones in New Jersey. Results showed that the proposed improved approach outperformed the traditional approach in terms of goodness-of-fit statistics. Moreover it is shown that the use of the traditional NB approach in this context can lead to the overestimation of the effect of work zone length on the crash occurrence. Copyright © 2013 Elsevier Ltd. All rights reserved.
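
    A sketch of the baseline negative binomial crash-frequency model the study starts from (the measurement-error extension is beyond a short example); the data are synthetic and statsmodels is assumed available:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 60                                  # work zones
        length = rng.uniform(0.5, 8.0, n)       # miles
        aadt = rng.uniform(20.0, 120.0, n)      # thousand vehicles per day
        mu = np.exp(-1.0 + 0.25 * length + 0.015 * aadt)
        crashes = rng.poisson(mu * rng.gamma(2.0, 0.5, n))  # overdispersed

        X = sm.add_constant(np.column_stack([length, aadt]))
        nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5))
        print(nb.fit().params)  # second coefficient: effect of zone length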

  11. Understanding software faults and their role in software reliability modeling

    NASA Technical Reports Server (NTRS)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
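
    A small numeric illustration of the collinearity problem described here: when LOC and statement count are nearly linear in one another, their correlation approaches 1 and the variance inflation factor explodes; the data are synthetic:

        import numpy as np

        rng = np.random.default_rng(0)
        loc = rng.integers(100, 5000, 200).astype(float)
        stmts = 0.8 * loc + rng.normal(0.0, 20.0, 200)  # near-linear relation

        r = np.corrcoef(loc, stmts)[0, 1]
        vif = 1.0 / (1.0 - r**2)    # two-predictor variance inflation factor
        print(f"correlation = {r:.4f}, VIF = {vif:.0f}")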

  12. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  13. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  14. The Pedestrian Evacuation Analyst: geographic information systems software for modeling hazard evacuation potential

    USGS Publications Warehouse

    Jones, Jeanne M.; Ng, Peter; Wood, Nathan J.

    2014-01-01

    Recent disasters such as the 2011 Tohoku, Japan, earthquake and tsunami; the 2013 Colorado floods; and the 2014 Oso, Washington, mudslide have raised awareness of catastrophic, sudden-onset hazards that arrive within minutes of the events that trigger them, such as local earthquakes or landslides. Due to the limited amount of time between generation and arrival of sudden-onset hazards, evacuations are typically self-initiated, on foot, and across the landscape (Wood and Schmidtlein, 2012). Although evacuation to naturally occurring high ground may be feasible in some vulnerable communities, evacuation modeling has demonstrated that other communities may require vertical-evacuation structures within a hazard zone, such as berms or buildings, if at-risk individuals are to survive some types of sudden-onset hazards (Wood and Schmidtlein, 2013). Researchers use both static least-cost-distance (LCD) and dynamic agent-based models to assess the pedestrian evacuation potential of vulnerable communities. Although both types of models help to understand the evacuation landscape, LCD models provide a more general overview that is independent of population distributions, which may be difficult to quantify given the dynamic spatial and temporal nature of populations (Wood and Schmidtlein, 2012). Recent LCD efforts related to local tsunami threats have focused on an anisotropic (directionally dependent) path distance modeling approach that incorporates travel directionality, multiple travel speed assumptions, and cost surfaces that reflect variations in slope and land cover (Wood and Schmidtlein, 2012, 2013). The Pedestrian Evacuation Analyst software implements this anisotropic path-distance approach for pedestrian evacuation from sudden-onset hazards, with a particular focus at this time on local tsunami threats. The model estimates evacuation potential based on elevation, direction of movement, land cover, and travel speed and creates a map showing travel times to safety (a
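
    A toy sketch of the path-distance idea behind the software: accumulate travel time over a grid whose per-cell speed reflects land cover (slope and anisotropy omitted for brevity); the grid, speeds and safe cell are invented:

        import heapq

        speed = [[1.2, 1.2, 0.3, 1.2],   # m/s; 0.3 ~ sand, 1.2 ~ road
                 [1.2, 0.0, 0.3, 1.2],   # 0.0 = impassable (water)
                 [1.2, 1.2, 1.2, 1.2]]
        CELL = 10.0                      # cell size, m

        def travel_times(grid, safe):
            # Dijkstra outward from the safe cell; cost of entering a cell
            # is cell size divided by that cell's walking speed.
            dist, pq = {safe: 0.0}, [(0.0, safe)]
            while pq:
                t, (r, c) = heapq.heappop(pq)
                if t > dist.get((r, c), float("inf")):
                    continue
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                            and grid[nr][nc] > 0:
                        nt = t + CELL / grid[nr][nc]
                        if nt < dist.get((nr, nc), float("inf")):
                            dist[(nr, nc)] = nt
                            heapq.heappush(pq, (nt, (nr, nc)))
            return dist

        print(travel_times(speed, (0, 3)))  # seconds to safety per cell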

  15. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  16. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  17. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
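
    For orientation, a sketch of the classical Goel-Okumoto NHPP that this family builds on (the paper's equilibrium-distribution variant is not reproduced); the weekly fault counts are illustrative:

        import numpy as np
        from scipy.optimize import curve_fit

        def mean_value(t, a, b):
            # Expected cumulative faults by time t: m(t) = a * (1 - e^(-b t))
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 11, dtype=float)
        faults = np.array([12, 21, 28, 34, 38, 41, 43, 45, 46, 47], float)

        (a, b), _ = curve_fit(mean_value, weeks, faults, p0=(50.0, 0.2))
        print(f"expected total faults a = {a:.1f}, "
              f"estimated remaining = {a - faults[-1]:.1f}")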

  18. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
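
    A hypothetical sketch of the two data structures named in the abstract, an identification tree matching a design element to threats and a mitigation tree mapping threats to costed countermeasures; the node layout is invented, not AutSEC's actual format:

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            label: str
            children: list = field(default_factory=list)

        ident = Node("dataflow crosses trust boundary",
                     [Node("tampering in transit"),
                      Node("information disclosure")])

        mitig = {"tampering in transit": [("message authentication", 2)],
                 "information disclosure": [("TLS encryption", 3),
                                            ("field-level encryption", 5)]}

        for threat in ident.children:
            for fix, cost in mitig.get(threat.label, []):
                print(f"{threat.label}: {fix} (cost {cost})")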

  19. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  1. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  2. A New Paradigm For Modeling Fault Zone Inelasticity: A Multiscale Continuum Framework Incorporating Spontaneous Localization and Grain Fragmentation.

    NASA Astrophysics Data System (ADS)

    Elbanna, A. E.

    2015-12-01

    The brittle portion of the crust contains structural features such as faults, jogs, joints, bends and cataclastic zones that span a wide range of length scales. These features may have a profound effect on earthquake nucleation, propagation and arrest. Incorporating these existing features in modeling, and the ability to spontaneously generate new ones in response to earthquake loading, is crucial for predicting seismicity patterns, the distribution of aftershocks and nucleation sites, earthquake arrest mechanisms, and topological changes in the seismogenic zone structure. Here, we report on our efforts in modeling two important mechanisms contributing to the evolution of fault zone topology: (1) grain comminution at the submeter scale, and (2) secondary faulting/plasticity at the scale of a few to hundreds of meters. We use the finite element software Abaqus to model the dynamic rupture. The constitutive response of the fault zone is modeled using the Shear Transformation Zone theory, a non-equilibrium statistical thermodynamic framework for modeling plastic deformation and localization in amorphous materials such as fault gouge. The gouge layer is modeled as a 2D plane-strain region with a finite thickness and a heterogeneous distribution of porosity. By coupling the amorphous gouge with the surrounding elastic bulk, the model introduces a set of novel features that go beyond the state of the art. These include: (1) self-consistent rate-dependent plasticity with a physically-motivated set of internal variables, (2) non-locality that alleviates mesh dependence of shear band formation, (3) spontaneous evolution of fault roughness and its strike, which affects ground motion generation and the local stress fields, and (4) spontaneous evolution of grain size and fault zone fabric.

  3. The thermochemical, two-phase dynamics of subduction zones: results from new, fully coupled models

    NASA Astrophysics Data System (ADS)

    Rees Jones, D. W.; Katz, R. F.; May, D.; Tian, M.; Rudge, J. F.

    2017-12-01

    Subduction zones are responsible for most of Earth's subaerial volcanism. However, previous geodynamic modelling of subduction zones has largely neglected magmatism. We previously showed that magmatism has a significant thermal impact, by advecting sensible heat into the lithosphere beneath arc volcanos [1]. Inclusion of this effect helps reconcile subduction zone models with petrological and heat flow observations. Many important questions remain, including how magma-mantle dynamics of subduction zones affects the position of arc volcanos and the character of their lavas. In this presentation, we employ a fully coupled, thermochemical, two-phase flow theory to investigate the dynamics of subduction zones. We present the first results from our new software (SubFUSc), which solves the coupled equations governing conservation of mass, momentum, energy and chemical species. The presence and migration of partial melts affect permeability and mantle viscosity (both directly and through their thermal impact); these, in turn, feed back on the magma-mantle flow. Thus our fully coupled modelling improves upon previous two-phase models that decoupled the governing equations and fixed the thermal structure [2]. To capture phase change, we use a novel, simplified model of the mantle melting in the presence of volatile species. As in the natural system, volatiles are associated with low-degree melting at temperatures beneath the anhydrous solidus; dehydration reactions in the slab supply volatiles into the wedge, triggering silicic melting. We simulate the migration of melts under buoyancy forces and dynamic pressure gradients. We thereby demonstrate the dynamical controls on the pattern of subduction-zone volcanism (particularly its location, magnitude, and chemical composition). We build on our previous study of the thermal consequences of magma genesis and segregation. We address the question of what controls the location of arc volcanoes themselves [3]. [1] Rees Jones, D. W

  4. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  5. Dependability modeling and assessment in UML-based software development.

    PubMed

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  6. Software Models Impact Stresses

    NASA Technical Reports Server (NTRS)

    Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark

    1991-01-01

    Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.

  7. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics, and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
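
    As a baseline for the count-regression component discussed above, a plain (non-Bayesian, non-max-margin) Poisson regression fitted by Newton's method on synthetic metric data:

        import numpy as np

        rng = np.random.default_rng(2)
        X = np.column_stack([np.ones(200), rng.normal(size=200)])  # 1 metric
        y = rng.poisson(np.exp(X @ np.array([0.5, 0.8])))          # defects

        beta = np.zeros(2)
        for _ in range(25):                # Newton-Raphson on log-likelihood
            mu = np.exp(X @ beta)
            grad = X.T @ (y - mu)
            hess = X.T @ (X * mu[:, None])
            beta += np.linalg.solve(hess, grad)
        print("estimated coefficients:", beta.round(3))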

  8. Hot 'nough for ya?: Controls and Constraints on modeling flux melting in subduction zones

    NASA Astrophysics Data System (ADS)

    Spiegelman, M.; Wilson, C. R.; van Keken, P.; Kelemen, P. B.; Hacker, B. R.

    2012-12-01

    required ingredients that allow for robust focusing of both fluids and hot solids to the sub-arc regions. We demonstrate coupled fluid/solid flow models for simplified geometries to understand the basic processes, as well as for more geologically relevant models from a range of observed arc geometries. We will also evaluate the efficacy of current wet melting parameterizations in these models. All of these models have been built using new modeling software we have been developing that allows unprecedented flexibility in the composition and solution of coupled multi-physics problems. Dubbed TerraFERMA (the transparent Finite Element Rapid Model Assembler...no relation to the convection code TERRA), this new software leverages several advanced computational libraries (FEniCS/PETSc/Spud) to make it significantly easier to construct and explore a wide range of models of varying complexity. Subduction zones provide an ideal application area for understanding the role of different degrees of coupling of fluid and solid dynamics and their relation to observations.

  10. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  11. HOMER® Energy Modeling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2000-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  12. Presenting an evaluation model of the trauma registry software.

    PubMed

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma causes about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. Trauma registries play an important and basic role in decreasing mortality and the disabilities caused by injuries resulting from trauma. Today, different software is designed for trauma registry. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites and related software in this domain. According to these general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability were evaluated. The model was presented to 12 experts and specialists using the Delphi technique. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved by the experts and professionals, the final version of the evaluation model for the trauma registry software was presented. The criteria of trauma registry software fall into two groups: 1- general criteria, 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, 4- decision and research support. The model presented in this research introduces the important general and specific criteria of trauma registry software.

  13. Residence-time framework for modeling multicomponent reactive transport in stream hyporheic zones

    NASA Astrophysics Data System (ADS)

    Painter, S. L.; Coon, E. T.; Brooks, S. C.

    2017-12-01

    Process-based models for transport and transformation of nutrients and contaminants in streams require tractable representations of solute exchange between the stream channel and biogeochemically active hyporheic zones. Residence-time based formulations provide an alternative to detailed three-dimensional simulations and have had good success in representing hyporheic exchange of non-reacting solutes. We extend the residence-time formulation for hyporheic transport to accommodate general multicomponent reactive transport. To that end, the integro-differential form of previous residence time models is replaced by an equivalent formulation based on a one-dimensional advection dispersion equation along the channel coupled at each channel location to a one-dimensional transport model in Lagrangian travel-time form. With the channel discretized for numerical solution, the associated Lagrangian model becomes a subgrid model representing an ensemble of streamlines that are diverted into the hyporheic zone before returning to the channel. In contrast to the previous integro-differential forms of the residence-time based models, the hyporheic flowpaths have semi-explicit spatial representation (parameterized by travel time), thus allowing coupling to general biogeochemical models. The approach has been implemented as a stream-corridor subgrid model in the open-source integrated surface/subsurface modeling software ATS. We use bedform-driven flow coupled to a biogeochemical model with explicit microbial biomass dynamics as an example to show that the subgrid representation is able to represent redox zonation in sediments and resulting effects on metal biogeochemical dynamics in a tractable manner that can be scaled to reach scales.
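    For orientation, the classic transient-storage formulation (Bencala and Walters, 1983) that residence-time models generalize couples a channel advection-dispersion equation to a single well-mixed storage zone; in common notation (not necessarily the paper's),

        \frac{\partial C}{\partial t} = -\frac{Q}{A}\frac{\partial C}{\partial x}
            + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right)
            + \alpha\left(C_s - C\right),
        \qquad
        \frac{dC_s}{dt} = \alpha \frac{A}{A_s}\left(C - C_s\right),

    where C and C_s are channel and storage-zone concentrations, Q is discharge, A and A_s are channel and storage cross-sectional areas, D is the dispersion coefficient, and \alpha is the exchange rate. The approach described above replaces the single well-mixed storage zone with an ensemble of hyporheic flowpaths parameterized by travel time, which is what permits coupling to general multicomponent biogeochemistry.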

  14. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  15. Models and metrics for software management and engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity, and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  16. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  17. A benchmark for subduction zone modeling

    NASA Astrophysics Data System (ADS)

    van Keken, P.; King, S.; Peacock, S.

    2003-04-01

    Our understanding of subduction zones hinges critically on the ability to discern their thermal structure and dynamics. Computational modeling has become an essential complementary approach to observational and experimental studies. The accurate modeling of subduction zones is challenging due to the unique geometry, complicated rheological description, and influence of fluid and melt formation. The complicated physics causes problems for the accurate numerical solution of the governing equations. As a consequence it is essential for the subduction zone community to be able to evaluate the abilities and limitations of various modeling approaches. The participants of a workshop on the modeling of subduction zones, held at the University of Michigan at Ann Arbor, MI, USA in 2002, formulated a number of case studies to be developed into a benchmark similar to previous mantle convection benchmarks (Blankenbach et al., 1989; Busse et al., 1991; Van Keken et al., 1997). Our initial benchmark focuses on the dynamics of the mantle wedge and investigates three different rheologies: constant viscosity, diffusion creep, and dislocation creep. In addition we investigate the ability of codes to accurately model dynamic pressure and advection-dominated flows. Proceedings of the workshop and the formulation of the benchmark are available at www.geo.lsa.umich.edu/~keken/subduction02.html. We strongly encourage interested research groups to participate in this benchmark. At Nice 2003 we will provide an update and a first set of benchmark results. Interested researchers are encouraged to contact one of the authors for further details.
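    For reference, the three wedge rheologies can be written in commonly used forms (the notation here is generic and may differ from the benchmark specification): constant viscosity \eta = \eta_0, and

        \eta_{\mathrm{diff}} = A_{\mathrm{diff}} \exp\!\left(\frac{E_{\mathrm{diff}}}{R T}\right),
        \qquad
        \eta_{\mathrm{disl}} = A_{\mathrm{disl}}\,\dot{\varepsilon}_{II}^{(1-n)/n} \exp\!\left(\frac{E_{\mathrm{disl}}}{n R T}\right),

    where A is a prefactor, E an activation energy, R the gas constant, T temperature, \dot{\varepsilon}_{II} the second invariant of the strain rate, and n the stress exponent (n = 1 recovers the Newtonian diffusion-creep form).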

  18. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, cancer registries are of great importance as the main core of cancer control programs, and many different software products have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the documents and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for the cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of the cancer registry software along with their sub-criteria. The evaluation method of this study was chosen as a criteria-based evaluation method based on the findings. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  19. The instrument control software package for the Habitable-Zone Planet Finder spectrometer

    NASA Astrophysics Data System (ADS)

    Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan

    2016-08-01

    We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
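    The custom subsystem protocols themselves are not reproduced in the record; the sketch below shows the general shape of a Twisted line-based TCP service of the kind such a backend might expose. The command names and port number are invented.

        # Generic Twisted asynchronous TCP server sketch (illustrative only; the
        # actual HPF ICS protocols are custom). Command names are invented.
        from twisted.internet import reactor, protocol
        from twisted.protocols.basic import LineReceiver

        class SubsystemProtocol(LineReceiver):
            """Answers simple status queries without blocking other connections."""
            def lineReceived(self, line):
                command = line.decode("ascii").strip().upper()
                if command == "STATUS":
                    self.sendLine(b"OK temp_k=77.01")   # placeholder telemetry
                else:
                    self.sendLine(b"ERR unknown command")

        class SubsystemFactory(protocol.Factory):
            def buildProtocol(self, addr):
                return SubsystemProtocol()

        reactor.listenTCP(5025, SubsystemFactory())     # port chosen arbitrarily
        reactor.run()                                   # single-threaded event loop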

  20. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  1. A viscoplastic shear-zone model for episodic slow slip events in oceanic subduction zones

    NASA Astrophysics Data System (ADS)

    Yin, A.; Meng, L.

    2016-12-01

    Episodic slow slip events occur widely along oceanic subduction zones at brittle-ductile transition depths (approximately 20-50 km). Although efforts have been devoted to unraveling their mechanical origins, the physical controls on the wide range of their recurrence intervals and slip durations remain unclear. In this study we present a simple mechanical model that attempts to account for the observed temporal evolution of slow slip events. In our model we assume that slow slip events occur in a viscoplastic shear zone (i.e., a Bingham material), which has an upper static and a lower dynamic plastic yield strength. We further assume that the hanging wall deformation is approximated as an elastic spring. We envision the shear zone to be initially locked during forward/landward motion but subsequently unlocked when the elastic and gravity-induced stress exceeds the static yield strength of the shear zone. This leads to backward/trenchward motion damped by viscous shear-zone deformation. As the elastic spring progressively loosens, the hanging wall velocity evolves with time and the viscous shear stress eventually reaches the dynamic yield strength. This is followed by the termination of the trenchward motion when the elastic stress is balanced by the dynamic yield strength of the shear zone and the gravity. In order to account for the zigzag slip-history pattern of typical repeated slow slip events, we assume that the shear zone progressively strengthens after each slow slip cycle, possibly caused by dilatancy as commonly assumed or by progressive fault healing through solution-transport mechanisms. We quantify our conceptual model by obtaining simple analytical solutions. Our model results suggest that the duration of the landward motion increases with the down-dip length and the static yield strength of the shear zone, but decreases with the ambient loading velocity and the elastic modulus of the hanging wall. The duration of the backward/trenchward motion depends
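    The paper derives analytical solutions; purely as a numerical illustration of the stick-slip logic described above, a quasi-static spring-slider with a Bingham shear zone can be integrated directly. All parameter values below are invented.

        # Quasi-static spring-slider sketch of a viscoplastic (Bingham) shear zone:
        # locked until the spring stress reaches the static yield strength, then
        # slipping at a viscously damped rate until stress drops to the dynamic
        # yield strength. Parameter values are invented for illustration.
        k, v_plate = 5.0, 1.0          # spring stiffness (MPa/m), loading rate (m/yr)
        tau_s, tau_d = 3.0, 1.0        # static and dynamic yield strengths (MPa)
        eta_over_h = 0.5               # shear-zone viscosity / thickness (MPa*yr/m)

        dt, t_end = 0.001, 10.0
        x, sliding = 0.0, False        # accumulated slip, lock state
        for step in range(int(t_end / dt)):
            t = step * dt
            tau = k * (v_plate * t - x)            # elastic stress from loading
            if not sliding and tau >= tau_s:
                sliding = True                     # static yield exceeded: unlock
            if sliding:
                v = (tau - tau_d) / eta_over_h     # Bingham flow rule
                if v <= 0.0:
                    sliding = False                # stress balanced: relock
                    v = 0.0
                x += v * dt                        # trenchward slip accumulates
        print(f"accumulated slip after {t_end} yr: {x:.2f} m")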

  2. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    EPA Science Inventory

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  3. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The objective of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study came from SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. For a first run, modeling the cumulative number of failures versus execution time gave fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for this data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures, and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases. It is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
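    SMERFS itself is not shown here; as a minimal illustration of the kind of reliability-growth fitting described, one of the classic NHPP models (Goel-Okumoto) can be fit to cumulative failure counts with scipy. The failure data below are invented.

        # Fit a Goel-Okumoto NHPP mean-value function m(t) = a * (1 - exp(-b*t))
        # to cumulative failure data (data invented; SMERFS implements ten such
        # models with proper statistical machinery).
        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 13, dtype=float)
        cum_failures = np.array([4, 9, 13, 18, 21, 24, 26, 28, 29, 30, 31, 32],
                                dtype=float)

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(40, 0.1))
        print(f"estimated total defects a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
        print(f"defects remaining after week 12: "
              f"{a_hat - goel_okumoto(12, a_hat, b_hat):.1f}")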

  4. Numerical modeling of fracking fluid and methane migration through fault zones in shale gas reservoirs

    NASA Astrophysics Data System (ADS)

    Taherdangkoo, Reza; Tatomir, Alexandru; Sauter, Martin

    2017-04-01

    Hydraulic fracturing operations in shale gas reservoirs have gained growing interest over the last few years. Groundwater contamination is one of the most important environmental concerns that have emerged surrounding shale gas development (Reagan et al., 2015). The potential impacts of hydraulic fracturing can be studied through the possible pathways for subsurface migration of contaminants towards overlying aquifers (Kissinger et al., 2013; Myers, 2012). The intent of this study is to investigate, by means of numerical simulation, two failure scenarios based on the presence of a fault zone that penetrates the full thickness of the overburden and connects the shale gas reservoir to an aquifer. Scenario 1 addresses the potential transport of fracturing fluid from the shale into the subsurface; it was modeled with the COMSOL Multiphysics software. Scenario 2 deals with the leakage of methane from the reservoir into the overburden; its numerical modeling was implemented in the discrete fracture model (DFM) simulator of DuMux, a free and open-source software (Tatomir, 2012). The modeling results are used to evaluate the influence of several important parameters (reservoir pressure, aquifer-reservoir separation thickness, fault zone inclination, porosity, permeability, etc.) that could affect fluid transport through the fault zone. Furthermore, we determined the main transport mechanisms and the circumstances that would allow fracturing fluid or methane to migrate through the fault zone into overlying geological layers. The results show that the presence of a conductive fault could reduce contaminant travel time and that, under certain hydraulic conditions, significant contaminant leakage is likely to occur. Bibliography: Kissinger, A., Helmig, R., Ebigbo, A., Class, H., Lange, T., Sauter, M., Heitfeld, M., Klünker, J., Jahnke, W., 2013. Hydraulic fracturing in unconventional gas reservoirs: risks in the geological system, part 2. Environ Earth Sci 70, 3855

  5. Modeling and Simulation for a Surf Zone Robot

    DTIC Science & Technology

    2012-12-14

    A multi-degree-of-freedom surf zone robot is developed and tested with a physical test platform and with a simulated robot in Robot Operating System. Derived from ... terrain. The application of the model to future platforms is analyzed and a broad examination of the current state of surf zone robotic systems is presented. Approved for public release; distribution is unlimited. (Eric Shuey, Lieutenant, United States Navy.)

  6. Engine Structures Modeling Software System (ESMOSS)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Engine Structures Modeling Software System (ESMOSS) is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The NASA Lewis Engine Structures Program is concerned with the development of technology for the rational structural design and analysis of advanced gas turbine engines, with emphasis on advanced structural analysis, structural dynamics, structural aspects of aeroelasticity, and life prediction. Fundamental and common to all of these developments is the need for geometric and analytical model descriptions at various engine assembly levels, which are generated using ESMOSS.

  7. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.

  8. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  9. Trajectory Software With Upper Atmosphere Model

    NASA Technical Reports Server (NTRS)

    Barrett, Charles

    2012-01-01

    Trajectory Software Applications 6.0 for the DEC Alpha platform includes an implementation of the Jacchia-Lineberry Upper Atmosphere Density Model used in the Mission Control Center for International Space Station support. Previous trajectory software required an upper atmosphere model to support atmospheric drag calculations in the Mission Control Center. Functional operation will differ depending on the end use of the module. In general, the calling routine uses function-calling arguments to specify input to the processor. The atmosphere model then computes and returns atmospheric density at the time of interest.
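    The Jacchia-Lineberry coefficients are not reproduced in the record; the sketch below uses a plain exponential-scale-height density profile as the simplest stand-in for such a model in a drag calculation. All numerical values are assumed, representative figures only.

        # Simplest stand-in for an upper-atmosphere density model: exponential
        # decay with a fixed scale height. (Jacchia-Lineberry additionally accounts
        # for solar and geomagnetic activity; its coefficients are not shown here.)
        import math

        RHO_450KM = 1.585e-12   # kg/m^3, representative thermospheric density (assumed)
        SCALE_HEIGHT_KM = 60.0  # representative scale height near 450 km (assumed)

        def density(alt_km, ref_alt_km=450.0):
            """Return atmospheric density in kg/m^3 at the given altitude."""
            return RHO_450KM * math.exp(-(alt_km - ref_alt_km) / SCALE_HEIGHT_KM)

        # Drag acceleration a = 0.5 * rho * v^2 * Cd * A / m, ISS-like values assumed.
        rho = density(420.0)
        v, cd, area, mass = 7660.0, 2.2, 1500.0, 420000.0
        print("density:", rho, "kg/m^3")
        print("drag accel:", 0.5 * rho * v**2 * cd * area / mass, "m/s^2")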

  10. 'Dhara': An Open Framework for Critical Zone Modeling

    NASA Astrophysics Data System (ADS)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high-resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine-scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine-scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce 'Dhara', an open framework for high-performance modeling of complex processes in the Critical Zone. The framework is designed to be modular in structure with the aim to create uniform and efficient tools to facilitate and leverage process modeling. It also provides flexibility to maintain, collaborate, and co-develop additional components with the scientific community. We show the essential framework that simulates ecohydrologic dynamics and surface-subsurface coupling in 3D using hybrid parallel CPU-GPU computing. We demonstrate that the open framework in Dhara is feasible for detailed, multi-process, and large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by the Intensively Managed Landscapes Critical Zone Observatory (IMLCZO) with representation from several CZOs and international representatives.

  11. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  12. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  13. Bioinactivation: Software for modelling dynamic microbial inactivation.

    PubMed

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. This software offers features such as user-friendliness, modelling of dynamic conditions, possibility to choose the fitting algorithm and generation of prediction intervals. The software is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo algorithm (MCMC). The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study, modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study has provided a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industry, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment.
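    Bioinactivation itself is an R package; as a language-neutral illustration of the simplest model family it implements, the isothermal Bigelow (log-linear) model, log10 N(t) = log10 N0 - t/D, can be fit by non-linear regression. The survival data below are invented.

        # Fit the isothermal Bigelow (log-linear) inactivation model.
        # bioinactivation (R) also covers dynamic models and MCMC fitting; this
        # Python sketch shows only the simplest member of that model family.
        import numpy as np
        from scipy.optimize import curve_fit

        def bigelow_log10(t, log_n0, d_value):
            return log_n0 - t / d_value

        time_min = np.array([0, 2, 4, 6, 8, 10], dtype=float)   # invented data
        log10_counts = np.array([6.1, 5.4, 4.8, 4.0, 3.3, 2.7])

        (log_n0, d_value), _ = curve_fit(bigelow_log10, time_min, log10_counts)
        print(f"log10 N0 = {log_n0:.2f}, D-value = {d_value:.2f} min")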

  14. GeoTess: A generalized Earth model software utility

    DOE PAGES

    Ballard, Sanford; Hipp, James; Kraus, Brian; ...

    2016-03-23

    GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. The software is available in Java and C++, with a C interface to the C++ library.

  15. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed of aspect-oriented composition techniques on model level. Model transformations support the transition from problem to solution space models. Aspect-oriented techniques enable the explicit expression and modularization of variability on model, template, and code level. The presented concepts are illustrated with a case study of a home automation system.

  16. Root zone water quality model (RZWQM2): Model use, calibration and validation

    USGS Publications Warehouse

    Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.

    2012-01-01

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado to illustrate the use of field and laboratory measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa to show a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

  17. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  18. Consistent Evolution of Software Artifacts and Non-Functional Models

    DTIC Science & Technology

    2014-11-14

    ...induce bad software performance? Subject terms: EOARD, Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Contact: Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy; Email: vittorio.cortellessa@univaq.it; Web: http://www.di.univaq.it/cortelle/. For sake of readability of the

  19. Sustainable Street Vendors Spatial Zoning Models in Surakarta

    NASA Astrophysics Data System (ADS)

    Rahayu, M. J.; Putri, R. A.; Rini, E. F.

    2018-02-01

    Various strategies carried out by Surakarta's government to organize street vendors have not achieved the goal of comprehensive street vendor arrangement. The street vendor arrangement strategy consists of physical (spatial) and non-physical components. One of the physical arrangements is to define street vendor zoning. Based on the street vendors' characteristics, there are two alternative locations for stabilization (one kind of street vendor arrangement) that can be used. The aim of this study is to examine those alternative locations to establish street vendor zoning models. A quantitative method is used to formulate the spatial zoning model. The street vendor zoning models are formulated based on two approaches: the distance to the vendors' residences and the distance to their previous trading locations. A geographic information system is used to locate all street vendors' residences and trading locations based on their type of goods. Using the proximity point distance tool in ArcGIS, we find the closeness of residential locations and previous trading locations to the alternative stabilization locations. The result shows that the locations street vendors choose to sell their goods mainly reflect proximity to their homes. It also shows street vendor zoning models based on the type of goods sold.

  20. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the model for the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  1. Data-driven traffic impact assessment tool for work zones.

    DOT National Transportation Integrated Search

    2017-03-01

    Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...

  2. HOMER® Energy Modeling Software 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2003-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  3. One-dimensional flow model of the river-hyporheic zone system

    NASA Astrophysics Data System (ADS)

    Pokrajac, D.

    2016-12-01

    The hyporheic zone is a shallow layer beneath natural streams that is characterized by intense exchange of water, nutrients, pollutants and thermal energy. Understanding these exchange processes is crucial for successful modelling of river hydrodynamics and morphodynamics at various scales, from the river corridor up to the river network scale (Cardenas, 2015). Existing simulation models of hyporheic exchange processes are either idealized models of tracer movement through the river-hyporheic zone system (e.g. TSM, Bencala and Walters, 1983) or detailed models of turbulent flow in a stream coupled with a conventional 2D Darcian groundwater model (e.g. Cardenas and Wilson, 2007). This paper presents an alternative approach, a simple 1D simulation model of the hyporheic zone system based on the classical shallow water equations (SWE) coupled with a newly developed porous media analogue. This allows incorporating the effects of flow unsteadiness and a non-Darcian parameterization of the drag term in the hyporheic zone model. The conceptual model of the stream-hyporheic zone system consists of a 1D model of the open channel flow in the river, coupled with a 1D model of the flow in the hyporheic zone via a volume flux driven by the difference between the water level in the river and in the hyporheic zone. The interaction with the underlying groundwater aquifer is neglected, but coupling the present model with any conventional groundwater model is straightforward. The paper presents the derivation of the 1D flow equations for flow in the hyporheic zone, the details of the numerical scheme used for solving them, and the model validation by comparison with published experimental data. References: Bencala, K. E., and R. A. Walters (1983) "Simulation of solute transport in a mountain pool-and-riffle stream: a transient storage model", Water Resources Research 19(3): 718-724. Cardenas, M. B. (2015) "Hyporheic zone hydrologic science: A historical account of its emergence and a

  4. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
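    PATT, the commercial tool used to build the actual model, is not shown; the toy sketch below uses the open-source SimPy library only to convey the discrete-event idea, with one process stepping through invented phase durations for each spiral iteration.

        # Toy discrete-event sketch of a spiral process using SimPy (the actual
        # model was built with the commercial PATT tool). Durations are invented.
        import simpy

        PHASES = [("risk assessment", 1), ("requirements", 2), ("design", 3),
                  ("coding", 4), ("testing", 3), ("evaluation", 1)]

        def spiral(env, iterations):
            for i in range(1, iterations + 1):
                for phase, weeks in PHASES:
                    yield env.timeout(weeks)       # phase occupies simulated time
                    print(f"iteration {i}: finished {phase} at week {env.now}")

        env = simpy.Environment()
        env.process(spiral(env, iterations=3))
        env.run()
        print("total simulated duration:", env.now, "weeks")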

  5. Software to model AXAF-I image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  6. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  7. Stochastic Ground Water Flow Simulation with a Fracture Zone Continuum Model

    USGS Publications Warehouse

    Langevin, C.D.

    2003-01-01

    A method is presented for incorporating the hydraulic effects of vertical fracture zones into two-dimensional cell-based continuum models of ground water flow and particle tracking. High hydraulic conductivity features are used in the model to represent fracture zones. For fracture zones that are not coincident with model rows or columns, an adjustment is required for the hydraulic conductivity value entered into the model cells to compensate for the longer flowpath through the model grid. A similar adjustment is also required for simulated travel times through model cells. A travel time error of less than 8% can occur for particles moving through fractures with certain orientations. The fracture zone continuum model uses stochastically generated fracture zone networks and Monte Carlo analysis to quantify uncertainties with simulated advective travel times. An approach is also presented for converting an equivalent continuum model into a fracture zone continuum model by establishing the contribution of matrix block transmissivity to the bulk transmissivity of the aquifer. The methods are used for a case study in west-central Florida to quantify advective travel times from a potential wetland rehydration site to a municipal supply wellfield. Uncertainties in advective travel times are assumed to result from the presence of vertical fracture zones, commonly observed on aerial photographs as photolineaments.
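    The paper's exact conductivity and travel-time corrections are not given in the abstract. The sketch below assumes the common stair-step representation, in which a fracture at angle theta to the grid axes is approximated by a chain of cells whose combined path length exceeds the true length by a factor |cos(theta)| + |sin(theta)|; scaling cell conductivity by that factor is one plausible compensation, offered here only as an assumption.

        # Sketch of a hydraulic-conductivity adjustment for a fracture zone that
        # cuts obliquely across a cell-based grid. ASSUMPTION: stair-stepped cell
        # chain, in-grid path longer than the true fracture by
        # |cos(theta)| + |sin(theta)|; the paper's actual correction may differ.
        import math

        def adjusted_conductivity(k_fracture, theta_deg):
            """Scale fracture-zone K so head loss along the stair-stepped path
            matches head loss along the true (shorter) fracture path."""
            theta = math.radians(theta_deg)
            path_ratio = abs(math.cos(theta)) + abs(math.sin(theta))
            return k_fracture * path_ratio

        for angle in (0.0, 22.5, 45.0):
            print(f"theta = {angle:5.1f} deg -> K multiplier = "
                  f"{adjusted_conductivity(1.0, angle):.3f}")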

  8. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  9. Atomistic Cohesive Zone Models for Interface Decohesion in Metals

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.; Saether, Erik; Glaessgen, Edward H.

    2009-01-01

    Using a statistical mechanics approach, a cohesive-zone law in the form of a traction-displacement constitutive relationship characterizing the load transfer across the plane of a growing edge crack is extracted from atomistic simulations for use within a continuum finite element model. The methodology for the atomistic derivation of a cohesive-zone law is presented. This procedure can be implemented to build cohesive-zone finite element models for simulating fracture in nanocrystalline or ultrafine grained materials.
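    The atomistically derived law itself is not reproduced in the record; for orientation, a widely used exponential traction-separation form (Xu and Needleman, 1993) for normal opening is

        T(\delta) = e\,\sigma_{\max}\,\frac{\delta}{\delta_c}\,\exp\!\left(-\frac{\delta}{\delta_c}\right),

    which peaks at T = \sigma_{\max} when \delta = \delta_c and implies a work of separation \Gamma_0 = e\,\sigma_{\max}\,\delta_c. An atomistically extracted law would replace this analytic form with traction-displacement data measured across the crack plane.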

  10. Development of an Environment for Software Reliability Model Selection

    DTIC Science & Technology

    1992-09-01

    ...now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling. Excerpted points from the report note that hardware can be repaired by spare modules, which is not the case for software, and that preventive maintenance is very important.

  11. NCAR global model topography generation software for unstructured grids

    NASA Astrophysics Data System (ADS)

    Lauritzen, P. H.; Bacmeister, J. T.; Callaghan, P. F.; Taylor, M. A.

    2015-06-01

    It is the purpose of this paper to document the NCAR global model topography generation software for unstructured grids. Given a model grid, the software computes the fraction of the grid box covered by land, the gridbox mean elevation, and associated sub-grid scale variances commonly used for gravity wave and turbulent mountain stress parameterizations. The software supports regular latitude-longitude grids as well as unstructured grids; e.g. icosahedral, Voronoi, cubed-sphere and variable resolution grids. As an example application and in the spirit of documenting model development, exploratory simulations illustrating the impacts of topographic smoothing with the NCAR-DOE CESM (Community Earth System Model) CAM5.2-SE (Community Atmosphere Model version 5.2 - Spectral Elements dynamical core) are shown.

  12. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  13. Software Assurance Competency Model

    DTIC Science & Technology

    2013-03-01

    ...(COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS. ...2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  14. The software-cycle model for re-engineering and reuse

    NASA Technical Reports Server (NTRS)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  15. DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS

    EPA Science Inventory

    The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...

  16. A bridge role metric model for nodes in software networks.

    PubMed

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, instead of only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the metric results and node degrees can be well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of module design of software systems and is expected to be beneficial to software engineering practices.
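    The bridge-role metric itself is paper-specific and not reproduced here; the reported power-law relation between metric values and node degree can be checked with an ordinary least-squares fit in log-log space, as sketched below with invented values.

        # Check a power-law relation y = c * k^alpha between a node metric and
        # node degree via least squares in log-log space (values invented; the
        # paper's bridge-role metric itself is not reproduced here).
        import numpy as np

        degrees = np.array([2, 3, 5, 8, 13, 21, 34, 55], dtype=float)
        noise = np.exp(np.random.default_rng(1).normal(0, 0.05, degrees.size))
        metric = 0.7 * degrees**1.8 * noise        # synthetic power-law data

        alpha, log_c = np.polyfit(np.log(degrees), np.log(metric), 1)
        print(f"fitted exponent alpha = {alpha:.2f}, "
              f"prefactor c = {np.exp(log_c):.2f}")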

  17. A Bridge Role Metric Model for Nodes in Software Networks

    PubMed Central

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, instead of only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the metric results and node degrees can be well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of module design of software systems and is expected to be beneficial to software engineering practices. PMID:25364938

  18. SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1994-01-01

    The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a

  19. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems' behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices—the iterative software development process, object-oriented methodology, and the Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  20. Application of FE software Elmer to the modeling of crustal-scale processes

    NASA Astrophysics Data System (ADS)

    Maierová, Petra; Guy, Alexandra; Lexa, Ondrej; Cadek, Ondrej

    2010-05-01

    We extended Elmer (the open-source finite element software for multiphysical problems, http://www.csc.fi/english/pages/elmer) with user-written procedures for the two-dimensional modeling of crustal-scale processes. The standard version of Elmer is an appropriate tool for modeling thermomechanical convection with non-linear viscous rheology. In geophysics, it might be suitable for some types of mantle convection modeling. Unlike the mantle, the crust is very heterogeneous. It consists of materials with distinct rheological properties that are subject to highly varied conditions: low pressure and temperature near the surface of the Earth, and relatively high pressure and temperature at depths of several tens of kilometers. Moreover, deformation in the upper crust is mostly brittle, and strain is concentrated into narrow shear zones and thrusts. In order to simulate the brittle behavior of the crust, we implemented a pressure-dependent visco-plastic rheology. Material heterogeneity and chemical convection are implemented in terms of active markers. Another special feature of the crust, the moving free surface, is already included in Elmer by means of a moving computational grid, and erosion can easily be added in this scheme. We tested the properties of our formulation of plastic flow on several numerical experiments simulating the deformation of material under compressional and extensional stresses. In the first step, we examined the angles of shear zones that form in a plastically deforming material for different material parameters and grid resolutions. A more complex setting of "sandbox-type" experiments containing heterogeneous material, strain softening and boundary friction was considered as a further test case. To illustrate the abilities of the extended Elmer software in crustal deformation studies, we present two models of geological processes: diapirism of the lower crust and channel flow forced by indentation. Both these processes are assumed to take

  1. The discounting model selector: Statistical software for delay discounting applications.

    PubMed

    Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A

    2017-05-01

    Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). Independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 values were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
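
    As a rough illustration of the kind of computation the software automates (the tool itself performs approximate Bayesian selection across several candidate models), the sketch below fits Mazur's hyperbolic model V = 1/(1 + kD) to normalized indifference points and reports ED50 = 1/k for that model. The data points are invented.

        # Hyperbolic-discounting fit and ED50; a stand-in for the tool's
        # Bayesian model selection, with made-up data.
        import numpy as np
        from scipy.optimize import curve_fit

        def hyperbolic(delay, k):
            return 1.0 / (1.0 + k * delay)           # value normalized to 1 (Mazur, 1987)

        delays = np.array([1, 7, 30, 90, 365], dtype=float)      # days
        values = np.array([0.95, 0.80, 0.55, 0.35, 0.15])        # indifference points
        (k_hat,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
        print(f"k = {k_hat:.4f}, ED50 = {1.0 / k_hat:.1f} days")  # ED50 = 1/k here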

  2. Modeling biogechemical reactive transport in a fracture zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molinero, Jorge; Samper, Javier; Yang, Chan Bing; Zhang, Guoxiang

    2005-01-14

    A coupled model of groundwater flow, reactive solute transport and microbial processes for a fracture zone of the Aspo site in Sweden is presented. This is the model of the so-called Redox Zone Experiment, aimed at evaluating the effects of tunnel construction on the geochemical conditions prevailing in a fractured granite. It is found that a model accounting for microbially mediated geochemical processes is able to reproduce the unexpected measured increasing trends of dissolved sulfate and bicarbonate. The model is also useful for testing hypotheses regarding the role of microbial processes and for evaluating the sensitivity of model results to changes in biochemical parameters.

  3. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes automated so far include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
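
    The package's actual API is not shown in this record; the sketch below illustrates only the modular idea, with each hydrological-cycle stage as a swappable component behind a small common interface. All class names, methods, and coefficients are invented.

        # Modular composition sketch: swap any PET or soil module that honors
        # the same small interface. Invented names and coefficients throughout.
        class SimplePET:
            """Temperature-index PET stub; any module exposing .pet() swaps in."""
            def pet(self, temp_c):
                return max(0.0, 0.3 * temp_c)        # mm/day, illustrative only

        class BucketSoil:
            """Single-bucket soil moisture accounting."""
            def __init__(self, capacity_mm):
                self.storage, self.capacity = 0.0, capacity_mm
            def step(self, precip_mm, pet_mm):
                water = self.storage + precip_mm - pet_mm
                runoff = max(0.0, water - self.capacity)   # overflow leaves as runoff
                self.storage = min(self.capacity, max(0.0, water))
                return runoff

        pet_model, soil = SimplePET(), BucketSoil(capacity_mm=150.0)
        for precip, temp in [(12.0, 18.0), (0.0, 22.0), (30.0, 15.0)]:
            q = soil.step(precip, pet_model.pet(temp))
            print(f"runoff = {q:.2f} mm, storage = {soil.storage:.1f} mm")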

  4. Temperature Models for the Mexican Subduction Zone

    NASA Astrophysics Data System (ADS)

    Manea, V. C.; Kostoglodov, V.; Currie, C.; Manea, M.; Wang, K.

    2002-12-01

    It is well known that temperature is one of the major factors controlling the seismogenic zone. The Mexican subduction zone is characterized by a very shallow, flat subducting plate interface in its central part (Acapulco, Oaxaca) and by more steeply dipping slabs to the north (Jalisco) and south (Chiapas). It has been proposed that the seismogenic zone is controlled, among other factors, by temperature. Therefore, we have developed four two-dimensional steady-state thermal models for Jalisco, Guerrero, Oaxaca and Chiapas. The updip limit of the seismogenic zone is taken between 100 °C and 150 °C, while the downdip limit is thought to be at 350 °C because of the transition from stick-slip to stable sliding. The shape of the subducting plate is inferred from gravity and seismicity. The convergence velocity between the oceanic and continental lithospheric plates is taken as follows: 5 cm/yr for the Jalisco profile, 5.5 cm/yr for Guerrero, 5.8 cm/yr for Oaxaca, and 7.8 cm/yr for Chiapas. The ages of the subducting plates, which are young and therefore provide the primary control on the forearc thermal structure, are as follows: 11 My for Jalisco, 14.5 My for Guerrero, 15 My for Oaxaca, and 28 My for Chiapas. We also introduced into the models a small quantity of frictional heating (pore pressure ratio 0.98). The value of 0.98 for the pore pressure ratio was obtained for the Guerrero profile in order to fit the intersection between the 350 °C isotherm and the subducting plate at 200 km from the trench. The value of a 200 km coupling zone from the trench is inferred from GPS data for the steady interseismic period and also from the last slow aseismic slip event that occurred in Guerrero in 2002. We used this value of the pore pressure ratio (0.98) for all the other profiles. For the other three profiles we obtained the following coupling extents: Jalisco, 100 km; Oaxaca, 170 km; and Chiapas, 125 km (from the trench). Independent constraints of the

  5. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.
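
    As one concrete instance of the reliability growth models surveyed, the sketch below implements the classic Goel-Okumoto NHPP model; the model choice and parameter values are illustrative, not taken from the report.

        # Goel-Okumoto reliability growth: m(t) = a * (1 - exp(-b t)) is the
        # expected cumulative failure count by test time t.
        import math

        def m(t, a, b):
            return a * (1.0 - math.exp(-b * t))

        def reliability(x, t, a, b):
            # probability of zero failures in (t, t + x], given testing up to t
            return math.exp(-(m(t + x, a, b) - m(t, a, b)))

        a, b = 120.0, 0.02                           # illustrative parameters
        print(f"expected failures by t=100: {m(100, a, b):.1f}")
        print(f"R(10 | t=100)             : {reliability(10, 100, a, b):.3f}")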

  6. Dynamic rupture models of subduction zone earthquakes with off-fault plasticity

    NASA Astrophysics Data System (ADS)

    Wollherr, S.; van Zelst, I.; Gabriel, A. A.; van Dinther, Y.; Madden, E. H.; Ulrich, T.

    2017-12-01

    Modeling tsunami genesis based on purely elastic seafloor displacement typically underpredicts tsunami sizes. Dynamic rupture simulations make it possible to analyse whether plastic energy dissipation is a missing rheological component by capturing the complex interplay of the rupture front, emitted seismic waves and the free surface in the accretionary prism. Strike-slip models with off-fault plasticity suggest decreasing rupture speed and extensive plastic yielding mainly at shallow depths. For simplified subduction geometries, inelastic deformation on the verge of Coulomb failure may enhance vertical displacement, which in turn favors the generation of large tsunamis (Ma, 2012). However, constraining appropriate initial conditions in terms of fault geometry and initial fault stress and strength remains challenging. Here, we present dynamic rupture models of subduction zones constrained by long-term seismo-thermo-mechanical (STM) modeling without any a priori assumption about regions of failure. The STM model provides self-consistent slab geometries, as well as initial stress and strength conditions that evolve in response to tectonic stresses, temperature, gravity, plasticity and pressure (van Dinther et al., 2013). Coseismic slip and coupled seismic wave propagation are modelled using the software package SeisSol (www.seissol.org), suited for complex fault zone structures and topography/bathymetry. SeisSol allows for local time-stepping, which drastically reduces the time-to-solution (Uphoff et al., 2017). This is particularly important in large-scale scenarios resolving small-scale features, such as the shallow angle between the megathrust fault and the free surface. Our dynamic rupture model uses a Drucker-Prager plastic yield criterion and accounts for thermal pressurization around the fault, mimicking the effect of pore pressure changes due to frictional heating. We first analyze the influence of this rheology on rupture dynamics and tsunamigenic properties, i.e. seafloor
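
    For readers unfamiliar with the yield criterion named above, the sketch below evaluates a standard Drucker-Prager yield function, f = sqrt(J2) + mu * sigma_mean - c, for a 3x3 stress tensor. The sign convention (compression negative) and all parameter values are illustrative assumptions, not inputs from the cited simulations.

        # Drucker-Prager yield check; f > 0 signals plastic yielding.
        import numpy as np

        def drucker_prager(sigma, cohesion, friction_coeff):
            i1 = np.trace(sigma)                     # first stress invariant
            s = sigma - (i1 / 3.0) * np.eye(3)       # deviatoric stress
            j2 = 0.5 * np.tensordot(s, s)            # second deviatoric invariant
            return np.sqrt(j2) + friction_coeff * (i1 / 3.0) - cohesion

        sigma = np.diag([-40e6, -25e6, -25e6])       # Pa, compressive stress state
        print(drucker_prager(sigma, cohesion=5e6, friction_coeff=0.6))  # < 0: elastic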

  7. Investigating Some Technical Issues on Cohesive Zone Modeling of Fracture

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    2011-01-01

    This study investigates some technical issues related to the use of cohesive zone models (CZMs) in modeling fracture processes. These issues include: why cohesive laws of different shapes can produce similar fracture predictions; under what conditions CZM predictions have a high degree of agreement with linear elastic fracture mechanics (LEFM) analysis results; when the shape of cohesive laws becomes important in fracture predictions; and why the opening profile along the cohesive zone length needs to be accurately predicted. Two cohesive models were used in this study to address these technical issues: the linear softening cohesive model and the Dugdale perfectly plastic cohesive model. Each cohesive model comprises five cohesive laws with different maximum tractions. All cohesive laws have the same cohesive work rate (CWR), which is defined as the area under the traction-separation curve. The effects of the maximum traction on the cohesive zone length and the critical remote applied stress are investigated for both models. For a CZM to predict a fracture load similar to that obtained by an LEFM analysis, the cohesive zone length needs to be much smaller than the crack length, which reflects the small-scale yielding condition required for LEFM analysis to be valid. For large-scale cohesive zone cases, the predicted critical remote applied stresses depend on the shape of the cohesive model used and can deviate significantly from LEFM results. Furthermore, this study also reveals the importance of accurately predicting the cohesive zone profile in determining the critical remote applied load.
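
    The abstract's bookkeeping is easy to make concrete: because every law shares one cohesive work rate (the area under the traction-separation curve), raising the maximum traction must shrink the critical separation. A minimal sketch with illustrative numbers:

        # Critical separation for a fixed cohesive work rate (CWR); the two
        # law shapes follow the record, the numbers are illustrative.
        def linear_softening_sep(cwr, t_max):
            return 2.0 * cwr / t_max                 # triangle: CWR = 0.5 * t_max * d_f

        def dugdale_sep(cwr, t_max):
            return cwr / t_max                       # rectangle: CWR = t_max * d_f

        cwr = 200.0                                  # J/m^2
        for t_max in (10e6, 20e6, 40e6):             # Pa
            print(t_max, linear_softening_sep(cwr, t_max), dugdale_sep(cwr, t_max))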

  8. Inhalation exposure to cleaning products: application of a two-zone model.

    PubMed

    Earnest, C Matt; Corsi, Richard L

    2013-01-01

    In this study, modifications were made to previously applied two-zone models to address important factors that can affect exposures during cleaning tasks. Specifically, we expand on previous applications of the two-zone model by (1) introducing the source in discrete elements (source cells) as opposed to a complete instantaneous release, (2) placing source cells in both the inner (near-person) and outer zones concurrently, (3) treating each source cell as an independent mixture of multiple constituents, and (4) tracking the time-varying liquid concentration and emission rate of each constituent in each source cell. Three experiments were performed in an environmentally controlled chamber with a thermal mannequin and a simplified pure chemical source to simulate emissions from a cleaning product. Gas-phase concentration measurements were taken in the bulk air and in the breathing zone of the mannequin to evaluate the model. The mean ratio of the integrated concentration in the mannequin's breathing zone to the concentration in the outer zone was 4.3 (standard deviation, σ = 1.6). The mean ratio of measured concentration in the breathing zone to predicted concentrations in the inner zone was 0.81 (σ = 0.16). Intake fractions ranged from 1.9 × 10⁻³ to 2.7 × 10⁻³. Model results reasonably predict those of previous exposure monitoring studies and indicate the inadequacy of well-mixed single-zone model applications for some but not all cleaning events.
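
    A minimal sketch of the two-zone idea with a depleting source, assuming first-order emission from the remaining product mass. Zone volumes, flows, and the source law are invented, and the model is far simpler than the paper's multi-constituent, multi-cell treatment.

        # Two-zone (near/far field) model with a decaying source, integrated
        # with SciPy; all numbers are illustrative.
        from scipy.integrate import solve_ivp

        V_N, V_F = 1.0, 30.0     # m^3: near-field (breathing) zone, room
        beta, Q = 5.0, 20.0      # m^3/h: interzonal airflow, room ventilation
        M0, k_e = 5.0, 2.0       # g of product applied; emission constant (1/h)

        def rhs(t, y):
            c_n, c_f, mass = y                       # g/m^3, g/m^3, g remaining
            g = k_e * mass                           # emission slows as source depletes
            return [(g + beta * (c_f - c_n)) / V_N,
                    (beta * (c_n - c_f) - Q * c_f) / V_F,
                    -g]

        sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, M0])
        print(f"near = {sol.y[0][-1]:.3f}, far = {sol.y[1][-1]:.3f} g/m^3 at t = 2 h")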

  9. The Software Architecture of Global Climate Models

    NASA Astrophysics Data System (ADS)

    Alexander, K. A.; Easterbrook, S. M.

    2011-12-01

    It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.

  10. Work zone safety analysis and modeling: a state-of-the-art review.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Xie, Kun

    2015-01-01

    Work zone safety is one of the top priorities for transportation agencies. In recent years, a considerable volume of research has sought to determine work zone crash characteristics and causal factors. Unlike other non-work zone-related safety studies (on both crash frequency and severity), there has not yet been a comprehensive review and assessment of methodological approaches for work zone safety. To address this deficit, this article aims to provide a comprehensive review of the existing extensive research efforts focused on work zone crash-related analysis and modeling, in the hopes of providing researchers and practitioners with a complete overview. Relevant literature published in the last 5 decades was retrieved from the National Work Zone Crash Information Clearinghouse and the Transport Research International Documentation database and other public digital libraries and search engines. Both peer-reviewed publications and research reports were obtained. Each study was carefully reviewed, and those that focused on either work zone crash data analysis or work zone safety modeling were identified. The most relevant studies are specifically examined and discussed in the article. The identified studies were carefully synthesized to understand the state of knowledge on work zone safety. Agreement and inconsistency regarding the characteristics of the work zone crashes discussed in the descriptive studies were summarized. Progress and issues about the current practices on work zone crash frequency and severity modeling are also explored and discussed. The challenges facing work zone safety research are then presented. The synthesis of the literature suggests that the presence of a work zone is likely to increase the crash rate. Crashes are not uniformly distributed within work zones and rear-end crashes are the most prevalent type of crashes in work zones. There was no across-the-board agreement among numerous papers reviewed on the relationship between work zone

  11. An information model for use in software management estimation and prediction

    NASA Technical Reports Server (NTRS)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.

  12. Incompletely Mixed Surface Transient Storage Zones at River Restoration Structures: Modeling Implications

    NASA Astrophysics Data System (ADS)

    Endreny, T. A.; Robinson, J.

    2012-12-01

    River restoration structures, also known as river steering deflectors, are designed to reduce bank shear stress by generating wake zones between the bank and the constricted conveyance region. There is interest in characterizing the surface transient storage (STS) and associated biogeochemical processing in the STS zones around these structures to quantify the ecosystem benefits of river restoration. This research explored how the hydraulics around river restoration structures prohibit the application of transient storage models designed for homogeneous, completely mixed STS zones. We used slug and constant-rate injections of a conservative tracer in a 3rd-order river in Onondaga County, NY over the course of five experiments at varying flow regimes. Recovered breakthrough curves spanned a transect including the main channel and wake zone at a j-hook restoration structure. We noted divergent patterns of peak solute concentrations and peak times within the wake zone regardless of transect location within the structure. Analysis reveals an inhomogeneous STS zone which is frequently still loading tracer after the main channel has peaked. The breakthrough curve loading patterns at the restoration structure violated the assumptions of simplified "random walk" 2-zone transient storage models, which seek to identify representative STS zones and zone locations. The use of structure-scale Wiener-filter-based multi-rate mass transfer models to characterize STS zone residence times is similarly dependent on a representative zone location. Each 2-zone model assumes one zone is a completely mixed STS zone and the other a completely mixed main channel. Our research reveals limits to the simple application of the recently developed 2-zone models, and raises important questions about the measurement scale necessary to identify critical STS properties at restoration sites. An explanation for the incompletely mixed STS zone may be the distinct hydraulics at restoration sites, including a constrained

  13. The inner zone electron model AE-5

    NASA Technical Reports Server (NTRS)

    Teague, M. J.; Vette, J. I.

    1972-01-01

    A description is given of the work performed in the development of the inner radiation zone electron model, AE-5. A complete description of the omnidirectional flux model is given for energy thresholds E_T in the range 4.0 ≥ E_T/(MeV) ≥ 0.04 and for L values in the range 2.8 ≥ L ≥ 1.2, for an epoch of October 1967. Confidence codes for certain regions of B-L space and certain energies are given, based on data coverage and the assumptions made in the analysis. The electron model programs that can be supplied to a user are referred to. One of these, a program for accessing the model flux at arbitrary points in B-L space and arbitrary energies, includes the latest outer zone electron model and proton model. The model AE-5 is based on data from five satellites, OGO 1, OGO 3, 1963-38C, OV3-3, and Explorer 26, spanning the period December 1964 to December 1967.

  14. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  15. Information models of software productivity - Limits on productivity growth

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1992-01-01

    Research into generalized information-metric models of software process productivity establishes quantifiable behavior and theoretical bounds. The models establish a fundamental mathematical relationship between software productivity and the human capacity for information traffic, the software product yield (system size), information efficiency, and tool and process efficiencies. An upper bound is derived that quantifies average software productivity and the maximum rate at which it may grow. This bound reveals that ultimately, when tools, methodologies, and automated assistants have reached their maximum effective state, further improvement in productivity can only be achieved through increased software reuse. The reuse advantage is shown to increase no faster than logarithmically in the number of reusable features available. The reuse bound is further shown to depend somewhat on the reuse policy: a general 'reuse everything' policy can lead to slower productivity growth than a specialized reuse policy.

  16. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  17. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  18. Parallel Software Model Checking

    DTIC Science & Technology

    2015-01-08

    checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its focus on formal verification. Generalized PDR: Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...

  19. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.

    2017-01-01

    The aim of this study is to analyze the structure and interaction mechanisms of typical cloud applications and to suggest approaches for optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing, at the same time, the placement of application data and the state of the virtual environment, taking the network topology into account. A model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experiments showed that this algorithm decreases cloud application response time and improves performance in processing user requests. The use of software-defined data storages allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.

  20. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The use of the Simulink Report Generator to create design documents from the models is presented, along with its use to run the simulation model and capture the results in the test report. Test automation using model-based development tools, supporting the use of a unique set of test cases across several testing levels and a test procedure that is independent of the software and hardware platform, is also presented.

  1. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  2. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  3. Theoretical model of the helium zone plate microscope

    NASA Astrophysics Data System (ADS)

    Salvador Palau, Adrià; Bracco, Gianangelo; Holst, Bodil

    2017-01-01

    Neutral helium microscopy is a new technique currently under development. Its advantages are the low energy, charge neutrality, and inertness of the helium atoms, a potential large depth of field, and the fact that at thermal energies the helium atoms do not penetrate into any solid material. This opens the possibility, among others, for the creation of an instrument that can measure surface topology on the nanoscale, even on surfaces with high aspect ratios. One of the most promising designs for helium microscopy is the zone plate microscope. It consists of a supersonic expansion helium beam collimated by an aperture (skimmer) and focused by a Fresnel zone plate onto a sample. The resolution is determined by the focal spot size, which depends on the size of the skimmer, the optics of the system, and the velocity spread of the beam through the chromatic aberrations of the zone plate. An important factor for the optics of the zone plate is the width of the outermost zone, corresponding to the smallest opening in the zone plate. The width of the outermost zone is fabrication-limited to around 10 nm with present-day state-of-the-art technology. Due to the high ionization potential of neutral helium atoms, it is difficult to build efficient helium detectors. Therefore, it is crucial to optimize the microscope design to maximize the intensity for a given resolution and width of the outermost zone. Here we present an optimization model for the helium zone plate microscope. Assuming constant resolution and width of the outermost zone, we are able to reduce the problem to a two-variable problem (zone plate radius and object distance) and we show that for a given beam temperature and pressure, there is always a single intensity maximum. We compare our model with the highest-resolution zone plate focusing images published and show that the intensity can be increased seven times. Reducing the width of the outermost zone to 10 nm leads to an increase in intensity of more than 8000
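
    Two quantities discussed above, the beam's de Broglie wavelength and the zone plate's focal length, can be estimated from standard relations (lambda = h / (m v), with v ≈ sqrt(5 k_B T0 / m) for a supersonic helium expansion, and f ≈ D * dr / lambda for a thin zone plate). The beam temperature and plate dimensions below are illustrative, not the paper's optimized values.

        # Back-of-envelope zone-plate optics for a neutral helium beam.
        import math

        h, k_B, m_he = 6.62607e-34, 1.380649e-23, 6.6464731e-27  # SI units

        def de_broglie(T0):
            v = math.sqrt(5.0 * k_B * T0 / m_he)     # terminal speed of the expansion
            return h / (m_he * v)                    # lambda = h / (m v)

        lam = de_broglie(300.0)                      # ~5.7e-11 m at room temperature
        D, dr = 0.2e-3, 10e-9                        # plate diameter; outermost zone width
        print(f"lambda = {lam:.3e} m, f = {D * dr / lam:.4f} m")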

  4. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    NASA Technical Reports Server (NTRS)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  5. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed; consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet, and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  6. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  7. Soil moisture dynamics modeling considering multi-layer root zone.

    PubMed

    Kumar, R; Shankar, V; Jat, M K

    2013-01-01

    The moisture uptake by plants from soil is a key process for plant growth and for the movement of water in the soil-plant system. A non-linear root water uptake (RWU) model was developed for a multi-layer crop root zone. The model comprises two parts: (1) model formulation and (2) moisture flow prediction. The developed model was tested for its efficiency in predicting moisture depletion in a non-uniform root zone. A field experiment on wheat (Triticum aestivum) was conducted in the sub-temperate, sub-humid agro-climate of Solan, Himachal Pradesh, India. Model-predicted soil moisture parameters, i.e., moisture status at various depths, moisture depletion and the soil moisture profile in the root zone, are in good agreement with experimental results. The results of the simulation emphasize the utility of the RWU model across different agro-climatic regions. The model can be used for sound irrigation management, especially in water-scarce humid, temperate, arid and semi-arid regions, and can also be integrated with a water transport equation to predict solute uptake by plant biomass.

  8. Bayesian Modeling of Exposure and Airflow Using Two-Zone Models

    PubMed Central

    Zhang, Yufen; Banerjee, Sudipto; Yang, Rui; Lungu, Claudiu; Ramachandran, Gurumurthy

    2009-01-01

    Mathematical modeling is increasingly being used as a means of assessing occupational exposures. However, predicting exposure in real settings is constrained by a lack of quantitative knowledge of exposure determinants. Validation of models in occupational settings is, therefore, a challenge. Not only do the model parameters need to be known, but the models also need to predict the output with some degree of accuracy. In this paper, a Bayesian statistical framework is used for estimating model parameters and exposure concentrations for a two-zone model. The model predicts concentrations in a zone near the source and far away from the source as functions of the toluene generation rate, the air ventilation rate through the chamber, and the airflow between near and far fields. The framework combines prior or expert information on the physical model with the observed data. The framework is applied to simulated data as well as data obtained from experiments conducted in a chamber. Toluene vapors are generated from a source under different conditions of airflow direction, the presence of a mannequin, and simulated body heat of the mannequin. The Bayesian framework accounts for uncertainty in measurement as well as in the unknown rate of airflow between the near and far fields. The results show that estimates of the interzonal airflow are always close to the estimated equilibrium solutions, which implies that the method works efficiently. The predictions of near-field concentration for both the simulated and real data show good concordance with the true values, indicating that the two-zone model assumptions agree with reality to a large extent and that the model is suitable for predicting the contaminant concentration. Comparison of the estimated model and its margin of error with the experimental data thus enables validation of the physical model assumptions. The approach illustrates how exposure models and information on model parameters, together with the knowledge of
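
    A toy version of the inference described: with the steady-state two-zone solutions C_far = G/Q and C_near = C_far + G/beta, a few lines of Metropolis sampling recover the interzonal airflow beta from noisy near-field measurements. This is a stand-in for the paper's full Bayesian framework, and every number is invented.

        # Metropolis sampler for the interzonal airflow in a steady-state
        # two-zone model; illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        G, Q, beta_true, sd = 100.0, 20.0, 8.0, 0.5  # mg/min, m^3/min, m^3/min, mg/m^3
        obs = G / Q + G / beta_true + rng.normal(0.0, sd, size=25)

        def log_post(beta):
            if beta <= 0.0:
                return -np.inf                       # flat prior on beta > 0
            pred = G / Q + G / beta                  # steady-state near-field level
            return -0.5 * np.sum((obs - pred) ** 2) / sd ** 2

        beta, chain = 5.0, []
        for _ in range(20000):
            prop = beta + rng.normal(0.0, 0.5)       # random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(beta):
                beta = prop
            chain.append(beta)
        print(f"posterior mean beta = {np.mean(chain[5000:]):.2f} (true {beta_true})")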

  9. The Site-Scale Saturated Zone Flow Model for Yucca Mountain

    NASA Astrophysics Data System (ADS)

    Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.

    2006-12-01

    This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system, subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework throughout the model domain. In addition, faults are much better represented using the 250 × 250 m spacing (compared to the previous model's 500 × 500 m spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of their Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the US Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and each boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the

  10. Domain and Specification Models for Software Engineering

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper discusses our approach to representing application domain knowledge for specific software engineering tasks. Application domain knowledge is embodied in a domain model. Domain models are used to assist in the creation of specification models. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model. One aspect of the system-hierarchical organization is described in detail.

  11. Model Package Report: Central Plateau Vadose Zone Geoframework Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springer, Sarah D.

    The purpose of the Central Plateau Vadose Zone (CPVZ) Geoframework model (GFM) is to provide a reasonable, consistent, and defensible three-dimensional (3D) representation of the vadose zone beneath the Central Plateau at the Hanford Site to support the Composite Analysis (CA) vadose zone contaminant fate and transport models. The GFM is a 3D representation of the subsurface geologic structure. From this 3D geologic model, exported results in the form of points, surfaces, and/or volumes are used as inputs to populate and assemble the various numerical model architectures, providing a 3D-layered grid that is consistent with the GFM. The objective of this report is to define the process used to produce a hydrostratigraphic model for the vadose zone beneath the Hanford Site Central Plateau and the corresponding CA domain.

  12. Generalized Pseudo-Reaction Zone Model for Non-Ideal Explosives

    NASA Astrophysics Data System (ADS)

    Wescott, Bradley

    2007-06-01

    The pseudo-reaction zone model was proposed to improve engineering scale simulations when using Detonation Shock Dynamics with high explosives that have a slow reaction component. In this work an extension of the pseudo-reaction zone model is developed for non-ideal explosives that propagate well below their steady-planar Chapman-Jouguet velocity. A programmed burn method utilizing Detonation Shock Dynamics and a detonation velocity dependent pseudo-reaction rate has been developed for non-ideal explosives and applied to the explosive mixture of ammonium nitrate and fuel oil (ANFO). The pseudo-reaction rate is calibrated to the experimentally obtained relation between normal detonation velocity and shock curvature. The generalized pseudo-reaction zone model proposed here predicts the cylinder expansion to within 1% by accounting for the slow reaction in ANFO.

  13. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  15. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  16. Modelling Fault Zone Evolution: Implications for fluid flow.

    NASA Astrophysics Data System (ADS)

    Moir, H.; Lunn, R. J.; Shipton, Z. K.

    2009-04-01

    Flow simulation models are of major interest to many industries, including hydrocarbon production, nuclear waste disposal, carbon dioxide sequestration and mining. One of the major uncertainties in these models lies in predicting the permeability of faults, principally in the detailed structure of the fault zone. Studying the detailed structure of a fault zone is difficult because sub-surface faults are inaccessible and because fault zones are highly complex: they show a high degree of spatial and temporal heterogeneity, i.e., the properties of a fault change along its length and also with time. It is well understood that faults influence fluid flow characteristics. They may act as a conduit or a barrier, or even as both, by blocking flow across the fault while promoting flow along it. Controls on fault hydraulic properties include cementation, stress field orientation, fault zone components and fault zone geometry. Within brittle rocks, such as granite, fracture networks are limited but provide the dominant pathway for flow within this rock type. Research at the EU's Soultz-sous-Forêts Hot Dry Rock test site [Evans et al., 2005] showed that 95% of flow into the borehole was associated with a single fault zone at 3490 m depth, and that 10 open fractures account for the majority of flow within the zone. These data underline the critical role of faults in deep flow systems and the importance of achieving a predictive understanding of fault hydraulic properties. To improve estimates of fault zone permeability, it is important to understand the underlying hydro-mechanical processes of fault zone formation. In this research, we explore the spatial and temporal evolution of fault zones in brittle rock through the development and application of a 2D hydro-mechanical finite element model, MOPEDZ. The authors have previously presented numerical simulations of the development of fault linkage structures from two or three pre-existing joints, the results of

  17. Work zone lane closure analysis model.

    DOT National Transportation Integrated Search

    2009-10-01

    At the Alabama Department of Transportation (ALDOT), the tool used by traffic engineers to predict whether a queue will form at a freeway work zone is the Excel-based "Lane Rental Model" developed at the Oklahoma Department of Transportation (OkDOT) ...

  18. Software Simplifies the Sharing of Numerical Models

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.

  19. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  20. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  1. An analytical model for non-conservative pollutants mixing in the surf zone.

    PubMed

    Ki, Seo Jin; Hwang, Jin Hwan; Kang, Joo-Hyon; Kim, Joon Ha

    2009-01-01

    Accurate simulation of the surf zone is a prerequisite for improving beach management as well as for understanding the fundamentals of the fate and transport of contaminants. In the present study, a diagnostic model modified from a classic solute model is provided to illuminate the behavior of non-conservative pollutants in the surf zone. To readily understand the controlling processes in the surf zone, a new dimensionless quantity is employed, the kappa number (K, the ratio of the inactivation rate to the transport rate of a microbial pollutant in the surf zone), which was then evaluated under different environmental frames during a week-long simulation period. The sensitivity analysis showed that hydrodynamics and concentration gradients in the surf zone depend mostly on n (the number of rip currents), indicating that n should be carefully adjusted in the model. The simulation results further reveal that large deviations typically occur in the daytime, signifying that inactivation of fecal indicator bacteria is the main process controlling surf zone water quality during the day. Overall, the analytical model shows good agreement between predicted and synthetic data (R² = 0.51 and 0.67 for FC and ENT, respectively) for the simulated period, supporting its potential use in surf zone modelling. It is recommended that when the dimensionless index is much larger than 0.5, the present modified model can predict better than the conventional model, but when the index is smaller than 0.5, the conventional model is more efficient with respect to time and cost.
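
    A sketch of how the record's kappa number might drive model choice, taking K as the stated ratio of inactivation rate to transport rate and applying the abstract's 0.5 threshold. The rate values are invented placeholders.

        # Kappa-number model selector; rates are placeholders, the 0.5
        # threshold follows the abstract's recommendation.
        def kappa(inactivation_rate, transport_rate):
            return inactivation_rate / transport_rate

        k_flush = 0.8                                # 1/h, surf-zone exchange/transport
        for label, k_die in (("day", 1.2), ("night", 0.1)):   # 1/h, sunlight vs. dark
            K = kappa(k_die, k_flush)
            model = "modified (non-conservative)" if K > 0.5 else "conventional"
            print(f"{label}: K = {K:.2f} -> {model} model")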

  2. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively on some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
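
    In the spirit of the fine-grained testing the authors advocate, here is a minimal, self-contained unit test of a numerical kernel checked against an analytic result. The kernel and tolerance are generic illustrations, not code from any particular climate model.

        # Unit test of a small numerical kernel against an analytic answer;
        # runs standalone or under pytest.
        import numpy as np

        def trapz_integral(y, dx):
            return dx * (y[0] / 2.0 + y[1:-1].sum() + y[-1] / 2.0)

        def test_trapz_matches_analytic_sine():
            x = np.linspace(0.0, np.pi, 2001)
            approx = trapz_integral(np.sin(x), x[1] - x[0])
            np.testing.assert_allclose(approx, 2.0, rtol=1e-6)  # integral of sin on [0, pi] is 2

        test_trapz_matches_analytic_sine()
        print("trapezoid kernel test passed")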

  3. A software development and evolution model based on decision-making

    NASA Technical Reports Server (NTRS)

    Wild, J. Christian; Dong, Jinghuan; Maly, Kurt

    1991-01-01

    Design is a complex activity whose purpose is to construct an artifact that satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest here, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM(sub p), which was partly implemented, is presented, along with a discussion of the use of this model in software reuse and process management.

  4. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  5. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effects Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  6. Multiphasic modeling of charged solute transport across articular cartilage: Application of multi-zone finite-bath model.

    PubMed

    Arbabi, Vahid; Pouran, Behdad; Weinans, Harrie; Zadpoor, Amir A

    2016-06-14

    Charged and uncharged solutes penetrate through cartilage to maintain the metabolic function of chondrocytes and possibly to restore or further break down the cartilage tissue in different stages of osteoarthritis. In this study the transport of charged solutes across the various zones of cartilage was quantified, taking into account the physicochemical interactions between the solute and the cartilage constituents. A multiphasic finite-bath finite element (FE) model was developed to simulate equine cartilage diffusion experiments that used a negatively charged contrast agent (ioxaglate) in combination with serial micro-computed tomography (micro-CT) to measure the diffusion. By comparing the FE model with the experimental data, both the diffusion coefficient of ioxaglate and the fixed charge density (FCD) were obtained. In the multiphasic model, cartilage was divided into multiple (three) zones to help understand how the diffusion coefficient and FCD vary across cartilage thickness. The direct effects of charged solute-FCD interaction on diffusion were investigated by comparing the diffusion coefficients derived from the multiphasic and biphasic-solute models. We found a relationship between the FCD obtained by the multiphasic model and ioxaglate partitioning obtained from micro-CT experiments. Using our multi-zone multiphasic model, the diffusion coefficient of the superficial zone was up to ten-fold higher than that of the middle zone, while the FCD of the middle zone was up to almost two-fold higher than that of the superficial zone. In conclusion, the developed finite-bath multiphasic model provides a non-destructive method by which both the diffusion coefficient and FCD of different cartilage zones can be obtained. The outcomes of this work will also help in understanding how the charge of the bath affects the diffusion of a charged molecule, and in predicting the diffusion behavior of a charged solute across articular cartilage.
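
    The exclusion of a negatively charged solute by the tissue's negative fixed charges can be illustrated with a simple ideal Donnan calculation. The sketch below is not the paper's multiphasic FE model, only a minimal demonstration of the charge effect, with hypothetical FCD and bath concentrations.

```python
# Illustrative sketch: ideal Donnan equilibrium showing how a negative fixed
# charge density (FCD) excludes a negatively charged solute such as ioxaglate.
# All values are hypothetical.
import math

def anion_partition(fcd_mM: float, bath_salt_mM: float) -> float:
    """Ideal Donnan partition coefficient of a monovalent anion.

    Electroneutrality inside the tissue: c_plus = c_minus + FCD,
    ideal Donnan equilibrium: c_plus * c_minus = c_bath**2, hence
    c_minus = (-FCD + sqrt(FCD**2 + 4*c_bath**2)) / 2.
    """
    c = bath_salt_mM
    c_minus = (-fcd_mM + math.sqrt(fcd_mM**2 + 4.0 * c**2)) / 2.0
    return c_minus / c

# Higher FCD (e.g., the middle zone) -> stronger exclusion of the anion.
for fcd in (50.0, 100.0, 200.0):  # mM fixed negative charge
    print(fcd, round(anion_partition(fcd, bath_salt_mM=150.0), 3))
```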

  7. Relating Cohesive Zone Model to Linear Elastic Fracture Mechanics

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    2010-01-01

    The conditions required for a cohesive zone model (CZM) to predict a failure load of a cracked structure similar to that obtained by a linear elastic fracture mechanics (LEFM) analysis are investigated in this paper. This study clarifies why many different phenomenological cohesive laws can produce similar fracture predictions. Analytical results for five cohesive zone models are obtained, using five different cohesive laws that have the same cohesive work rate (CWR, the area under the traction-separation curve) but different maximum tractions. The effect of the maximum traction on the predicted cohesive zone length and the remote applied load at fracture is presented. Similar to the small-scale yielding condition required for an LEFM analysis to be valid, the cohesive zone length also needs to be much smaller than the crack length. This is a necessary condition for a CZM to obtain a fracture prediction equivalent to an LEFM result.
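
    The "cohesive zone much smaller than crack length" condition can be checked with a classical analytical estimate. The sketch below uses a Dugdale-type cohesive zone length, one common choice of prefactor among several (the prefactor depends on the cohesive law), with hypothetical material values.

```python
# Illustrative sketch: a classical estimate of cohesive zone length and the
# "small cohesive zone" check discussed above. The Dugdale prefactor pi/8 is
# one common choice; other cohesive laws give different prefactors.
import math

def cohesive_zone_length(E_prime: float, G_c: float, sigma_max: float) -> float:
    """Dugdale-type estimate l_cz = (pi/8) * E' * G_c / sigma_max**2,
    using K_I**2 = E' * G_c at fracture."""
    return (math.pi / 8.0) * E_prime * G_c / sigma_max**2

# Hypothetical material: E' = 70 GPa, G_c = 100 J/m^2, crack length a = 20 mm.
E_prime, G_c, a = 70e9, 100.0, 0.02
for sigma_max in (50e6, 200e6, 800e6):  # maximum traction (Pa)
    l_cz = cohesive_zone_length(E_prime, G_c, sigma_max)
    # LEFM-equivalent predictions require l_cz << a (cf. small-scale yielding).
    print(f"sigma_max={sigma_max:.0e} Pa  l_cz={l_cz*1e3:.3f} mm  "
          f"l_cz/a={l_cz/a:.4f}")
```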

  8. Three-dimensional structure and seismicity beneath the Central Vanuatu subduction zone

    NASA Astrophysics Data System (ADS)

    Foix, Oceane; Crawford, Wayne; Pelletier, Bernard; Regnier, Marc; Garaebiti, Esline; Koulakov, Ivan

    2017-04-01

    The 1400-km-long Vanuatu subduction zone results from subduction of the oceanic Australian plate (OAP) beneath the North-Fijian microplate (NFM). Seismic and volcanic activity are both high, and several morphologic features enter into subduction, affecting seismicity and probably plate coupling. The Entrecasteaux Ridge, West-Torres plateau, and Bougainville seamount currently enter into subduction below the large forearc islands of Santo and Malekula. This collision coincides with a strongly decreased local convergence rate - 35 mm/yr, compared to 120-160 mm/yr to the north and south - and significant uplift of the overriding plate, indicating a high degree of deformation. The close proximity of large uplifted forearc islands to the trench provides excellent coverage of the megathrust seismogenic zone for a seismological study. We used 10 months of seismological data collected with the 30-instrument land and sea ARC-VANUATU seismology network to construct a 3D velocity model — using the LOTOS joint location/model inversion software — and to locate 11,655 earthquakes using the NonLinLoc software suite. The 3D model reveals low P and S velocities in the first tens of kilometers beneath both islands, probably due to water infiltration in the heavily faulted upper plate. The model also suggests the presence of a subducted seamount beneath south Santo. The earthquake locations reveal a complex interaction of faults and stress zones related to high and highly variable deformation. Both brittle deformation and the seismogenic zone depth limits vary along the slab, and earthquake clusters are identified beneath central and south Santo at about 10-30 km depth, and southwest of Malekula island at 10-20 km depth.

  9. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models are used in MBT, and if the type of model changes from one to another, all functions of a search technique must be reimplemented because the model types differ, even when the same search technique is applied. Implementing the same algorithm over and over again requires too much time and effort. We propose a model-independent software framework for SBST that can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of model. PMID:25302314
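
    The framework's core idea, writing the search once against an abstract model interface so that a new model type means implementing the interface rather than reimplementing the search, can be sketched as follows. All class and method names here are hypothetical illustrations, not the framework's actual API.

```python
# A minimal sketch of the model-independence idea: a model-agnostic search
# algorithm coded once against an abstract interface. Names are hypothetical.
import random
from abc import ABC, abstractmethod

class TestModel(ABC):
    """What any system model must provide to the search."""
    @abstractmethod
    def random_candidate(self): ...
    @abstractmethod
    def neighbor(self, candidate): ...
    @abstractmethod
    def fitness(self, candidate) -> float:
        """Higher is better (e.g., coverage of target transitions)."""

def hill_climb(model: TestModel, iterations: int = 1000):
    """Model-agnostic search: works for any TestModel implementation."""
    best = model.random_candidate()
    best_fit = model.fitness(best)
    for _ in range(iterations):
        cand = model.neighbor(best)
        fit = model.fitness(cand)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

class BitStringModel(TestModel):
    """Toy stand-in for a state-machine model: maximize ones."""
    def __init__(self, n=32):
        self.n = n
    def random_candidate(self):
        return [random.randint(0, 1) for _ in range(self.n)]
    def neighbor(self, c):
        c = c[:]
        c[random.randrange(self.n)] ^= 1
        return c
    def fitness(self, c):
        return sum(c)

print(hill_climb(BitStringModel(), iterations=500)[1])
```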

  10. Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng

    This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate the settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short-circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
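
    A simplified flavor of the over-reach problem addressed in the coordination step can be sketched as a load-encroachment check on a mho zone-3 characteristic. This is not the paper's algorithm; the extreme loading point follows common industry practice (0.85 pu voltage, 150% of emergency rating, 30 degree load angle), and all numeric values are hypothetical.

```python
# Illustrative sketch: does a mho zone-3 characteristic pick up the apparent
# impedance of an extreme load? Hypothetical values, not the paper's method.
import cmath, math

def load_apparent_impedance(v_kv_ll: float, mva_emergency: float) -> complex:
    """Apparent impedance (primary ohms) at a common extreme loading point:
    Z = (0.85*V_LL)^2 / (1.5*S), taken at a 30 degree load angle."""
    z_mag = (0.85 * v_kv_ll * 1e3) ** 2 / (1.5 * mva_emergency * 1e6)
    return cmath.rect(z_mag, math.radians(30.0))

def inside_mho(z: complex, reach_ohms: float, line_angle_deg: float) -> bool:
    """Mho circle through the origin with diameter 'reach' at the line angle."""
    center = cmath.rect(reach_ohms / 2.0, math.radians(line_angle_deg))
    return abs(z - center) <= reach_ohms / 2.0

z_load = load_apparent_impedance(v_kv_ll=345.0, mva_emergency=800.0)
for reach in (60.0, 90.0, 120.0):  # candidate zone-3 reach settings (ohms)
    status = "encroaches on load" if inside_mho(z_load, reach, 75.0) else "ok"
    print(f"reach={reach:.0f} ohm: {status}")
```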

  11. Visualising higher order Brillouin zones with applications

    NASA Astrophysics Data System (ADS)

    Andrew, R. C.; Salagaram, T.; Chetty, N.

    2017-05-01

    A key concept in material science is the relationship between the Bravais lattice, the reciprocal lattice and the resulting Brillouin zones (BZ). These zones are often complicated shapes that are hard to construct and visualise without the use of sophisticated software, even by professional scientists. We have used a simple sorting algorithm to construct BZ of any order for a chosen Bravais lattice that is easy to implement in any scientific programming language. The resulting zones can then be visualised using freely available plotting software. This method has pedagogical value for upper-level undergraduate students since, along with other computational methods, it can be used to illustrate how constant-energy surfaces combine with these zones to create van Hove singularities in the density of states. In this paper we apply our algorithm along with the empirical pseudopotential method and the 2D equivalent of the tetrahedron method to show how they can be used in a simple software project to investigate this interaction for a 2D crystal. This project not only enhances students’ fundamental understanding of the principles involved but also improves transferable coding skills.
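
    The sorting construction is short enough to sketch in full: a k-point belongs to the n-th Brillouin zone when the origin is its n-th nearest reciprocal lattice point. The sketch below applies this to a 2D square lattice; the function names, sampling window, and shell count are arbitrary choices, and plotting is left to whatever tool the reader prefers.

```python
# A sketch of the sorting algorithm described above, for a 2D square lattice.
import numpy as np

def bz_order(kpoints: np.ndarray, b1, b2, shells: int = 6) -> np.ndarray:
    """Return, for each k-point, the order n of its Brillouin zone."""
    # Enough reciprocal lattice points to rank the origin correctly.
    ij = np.arange(-shells, shells + 1)
    G = np.array([i * np.asarray(b1) + j * np.asarray(b2)
                  for i in ij for j in ij])            # (M, 2), includes origin
    origin_idx = np.flatnonzero(np.all(G == 0.0, axis=1))[0]
    d = np.linalg.norm(kpoints[:, None, :] - G[None, :, :], axis=2)  # (N, M)
    ranks = np.argsort(d, axis=1)       # lattice points sorted by distance
    # Position of the origin in each sorted list, 1-based -> zone order n.
    return np.argmax(ranks == origin_idx, axis=1) + 1

# Square lattice, lattice constant a: b1 = (2*pi/a, 0), b2 = (0, 2*pi/a).
a = 1.0
b = 2.0 * np.pi / a
xs = np.linspace(-2.5 * b, 2.5 * b, 101)
kx, ky = np.meshgrid(xs, xs)
k = np.column_stack([kx.ravel(), ky.ravel()])
zones = bz_order(k, (b, 0.0), (0.0, b)).reshape(kx.shape)
print(zones.min(), zones.max())  # zone indices covering the sampled window
```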

  12. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high-level, domain-specific modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  13. Software life cycle dynamic simulation model: The organizational performance submodel

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.

  14. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

    The Internet enables information to be accessed anytime and anywhere. This creates an environment in which information can be easily copied, and easy access to the Internet is one of the factors contributing to piracy in Malaysia as well as the rest of the world. The 2013 BSA Global Software Survey on the compliance gap found that 43 percent of the software installed on PCs around the world was not properly licensed, and the commercial value of the unlicensed installations worldwide was reported to be US$62.7 billion. Piracy can happen anywhere, including universities. Malaysia, like other countries in the world, faces issues of piracy committed by university students. Piracy in universities concerns acts of stealing intellectual property, whether software piracy, music piracy, movie piracy, or piracy of intellectual materials such as books, articles and journals. This puts owners' intellectual property in jeopardy. This study developed a classification model for detecting software piracy. The model was developed using a swarm intelligence algorithm called the Ant Colony Optimization algorithm. The training data were collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.

  15. An Investigation of Software Scaffolds Supporting Modeling Practices

    NASA Astrophysics Data System (ADS)

    Fretz, Eric B.; Wu, Hsin-Kai; Zhang, Baohui; Davis, Elizabeth A.; Krajcik, Joseph S.; Soloway, Elliot

    2002-08-01

    Modeling of complex systems and phenomena is of value in science learning and is increasingly emphasised as an important component of science teaching and learning. Modeling engages learners in desired pedagogical activities. These activities include practices such as planning, building, testing, analysing, and critiquing. Designing realistic models is a difficult task. Computer environments allow the creation of dynamic and even more complex models. One way of bringing the design of models within reach is through the use of scaffolds. Scaffolds are intentional assistance provided to learners from a variety of sources, allowing them to complete tasks that would otherwise be out of reach. Currently, our understanding of how scaffolds in software tools assist learners is incomplete. In this paper the scaffolds designed into a dynamic modeling software tool called Model-It are assessed in terms of their ability to support learners' use of modeling practices. Four pairs of middle school students were videotaped as they used the modeling software for three hours, spread over a two-week time frame. Detailed analysis of coded videotape transcripts provided evidence of the importance of scaffolds in supporting the use of modeling practices. Learners used a variety of modeling practices, the majority of which occurred in conjunction with scaffolds. The use of three tool scaffolds was assessed as directly as possible, and these scaffolds were seen to support a variety of modeling practices. An argument is made for the continued empirical validation of types and instances of tool scaffolds, and further investigation of the important role of teacher and peer scaffolding in the use of scaffolded tools.

  16. Multiscale Modeling of Grain-Boundary Fracture: Cohesive Zone Models Parameterized From Atomistic Simulations

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Phillips, Dawn R.; Yamakov, Vesselin

    2006-01-01

    A multiscale modeling strategy is developed to study grain boundary fracture in polycrystalline aluminum. Atomistic simulation is used to model fundamental nanoscale deformation and fracture mechanisms and to develop a constitutive relationship for separation along a grain boundary interface. The nanoscale constitutive relationship is then parameterized within a cohesive zone model to represent variations in grain boundary properties. These variations arise from the presence of vacancies, interstitials, and other defects, in addition to deviations in grain boundary angle from the baseline configuration considered in the molecular dynamics simulation. The parameterized cohesive zone models are then used to model grain boundaries within finite element analyses of aluminum polycrystals.

  17. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    ERIC Educational Resources Information Center

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…

  18. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  19. RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.

    PubMed

    Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z

    2017-04-01

    We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R, and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation.
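
    The estimation approach described, maximum likelihood fitting of the linear-quadratic model under a Poisson assumption on colony counts, can be sketched independently of RAD-ADAPT itself. The dose-response data below are invented purely for illustration.

```python
# A self-contained sketch (not RAD-ADAPT) of Poisson maximum likelihood
# fitting of the linear-quadratic model to clonogenic assay counts.
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0])       # Gy
seeded = np.array([100, 200, 400, 2000, 10000])  # cells plated per dish
counts = np.array([55, 80, 95, 160, 140])        # colonies observed (made up)

def neg_log_likelihood(params):
    """Poisson NLL for mu = seeded * PE * exp(-(alpha*D + beta*D^2))."""
    log_pe, alpha, beta = params
    mu = seeded * np.exp(log_pe) * np.exp(-(alpha * dose + beta * dose**2))
    return -(counts * np.log(mu) - mu).sum()  # ln k! dropped (constant)

fit = minimize(neg_log_likelihood, x0=[np.log(0.5), 0.3, 0.03],
               method="Nelder-Mead")
log_pe, alpha, beta = fit.x
print(f"PE={np.exp(log_pe):.3f}  alpha={alpha:.3f}/Gy  beta={beta:.4f}/Gy^2")
```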

  20. Comparing two-zone models of dust exposure.

    PubMed

    Jones, Rachael M; Simmons, Catherine E; Boelter, Fred W

    2011-09-01

    The selection and application of mathematical models to work tasks is challenging. Previously, we developed and evaluated a semi-empirical two-zone model that predicts time-weighted average (TWA) concentrations (Ctwa) of dust emitted during the sanding of drywall joint compound. Here, we fit the emission rate and random air speed variables of a mechanistic two-zone model to testing event data, then apply and evaluate the model using data from two field studies. We found that the fitted random air speed values and emission rate were sensitive to (i) the size of the near-field and (ii) the objective function used for fitting, but this did not substantially impact the predicted dust Ctwa. The mechanistic model predictions were lower than the semi-empirical model predictions and the measured respirable dust Ctwa at Site A, but were within an acceptable range. At Site B, a 10.5 m3 room, the mechanistic model did not capture the observed difference between personal breathing zone (PBZ) and area Ctwa: the model predicted uniform mixing, and predicted dust Ctwa up to an order of magnitude greater than was measured. We suggest that applications of the mechanistic model be limited to contexts where the near-field volume is very small relative to the far-field volume.
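
    Mechanistic two-zone models of this family follow a standard near-field/far-field mass balance. A minimal sketch is shown below, with hypothetical parameter values and the common convention that the inter-zone airflow beta is half the free surface area of the near field times the random air speed.

```python
# Illustrative sketch of a near-field/far-field two-zone model integrated to a
# time-weighted average concentration. Parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

G = 5.0               # emission rate, mg/min
Q = 2.0               # room supply/exhaust airflow, m^3/min
V_N, V_F = 1.0, 100.0 # near- and far-field volumes, m^3
FSA, s = 3.0, 5.0     # near-field free surface area (m^2), random air speed (m/min)
beta = 0.5 * FSA * s  # inter-zone airflow, m^3/min

def rhs(t, c):
    c_n, c_f = c
    return [(G + beta * (c_f - c_n)) / V_N,
            (beta * (c_n - c_f) - Q * c_f) / V_F]

t_end = 60.0  # task duration, min
sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0], dense_output=True, max_step=0.1)
t = np.linspace(0.0, t_end, 601)
c_n, c_f = sol.sol(t)
print(f"TWA near field: {np.trapz(c_n, t) / t_end:.2f} mg/m^3")
print(f"TWA far field : {np.trapz(c_f, t) / t_end:.2f} mg/m^3")
```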

  1. System Operations Studies : Feeder System Model. User's Manual.

    DOT National Transportation Integrated Search

    1982-11-01

    The Feeder System Model (FSM) is one of the analytic models included in the System Operations Studies (SOS) software package developed for urban transit systems analysis. The objective of the model is to assign a proportion of the zone-to-zone travel...

  2. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    NASA Astrophysics Data System (ADS)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of reference results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published in its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the GitHub repository on the master and main development branches. The usage of the CMake configuration tool

  3. Algorithms for Coastal-Zone Color-Scanner Data

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Software for Nimbus-7 Coastal-Zone Color-Scanner (CZCS) derived products consists of a set of scientific algorithms for extracting information from CZCS-gathered data. The software uses the CZCS-generated Calibrated Radiance Temperature (CRT) tape as input and outputs a computer-compatible tape and film products.

  4. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.

  5. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    The Computer Aided Software Reliability Estimation (CASRE) computer program was developed for use in measuring the reliability of other software. It is easier for non-specialists in reliability to use than many other currently available programs developed for the same purpose. CASRE incorporates the mathematical modeling capabilities of the public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in the Windows environment. It provides a menu-driven command interface; enabling and disabling of menu options guides the user through (1) selection of a set of failure data, (2) execution of a mathematical model, and (3) analysis of results from the model. Written in the C language.

  6. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  7. HOMER® Energy Modeling Software V2.63

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2003-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  8. HOMER® Energy Modeling Software V2.64

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2003-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  9. HOMER® Energy Modeling Software V2.65

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2008-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  10. HOMER® Energy Modeling Software V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2003-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  11. HOMER® Energy Modeling Software V2.19

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2008-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  12. HOMER® Energy Modeling Software V2.67

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Tom

    2008-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass and other inputs.

  13. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software developers to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability MBT tool developed by our company, a single modeling pass can automatically generate test case documents, which is efficient and accurate. Describing a process accurately with a UML model depends on the paths that can be reached through it. Existing path generation algorithms are either too simple, unable to combine branch paths and loops into complete paths, or too cumbersome, generating arrangements of paths that are meaningless and superfluous for aerospace software testing. Drawing on our experience with ten aerospace payloads, we developed a path generation algorithm tailored to UML graphic descriptions of aerospace test software.

  14. Evaluating Sustainability Models for Interoperability through Brokering Software

    NASA Astrophysics Data System (ADS)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  15. Observation model and parameter partials for the JPL geodetic GPS modeling software GPSOMC

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.; Border, J. S.

    1988-01-01

    The physical models employed in GPSOMC and the modeling module of the GIPSY software system developed at JPL for analysis of geodetic Global Positioning Satellite (GPS) measurements are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities in the current report with their counterparts in the computer programs. There are no basic model revisions, with the exceptions of an improved ocean loading model and some new options for handling clock parametrization. Such misprints as were discovered were corrected. Further revisions include modeling improvements and assurances that the model description is in accord with the current software.

  16. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects: self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and on what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  17. Toward Broadband Source Modeling for the Himalayan Collision Zone

    NASA Astrophysics Data System (ADS)

    Miyake, H.; Koketsu, K.; Kobayashi, H.; Sharma, B.; Mishra, O. P.; Yokoi, T.; Hayashida, T.; Bhattarai, M.; Sapkota, S. N.

    2017-12-01

    The Himalayan collision zone is characterized by a distinctive tectonic setting. There are earthquakes with low-angle thrust faulting as well as continental outer-rise earthquakes. Recently, several historical earthquakes have been identified by active fault surveys [e.g., Sapkota et al., 2013]. We here investigate source scaling for the Himalayan collision zone as a fundamental factor in constructing source models for seismic hazard assessment. Regarding source scaling for collision zones, Yen and Ma [2011] reported the subduction-zone source scaling in Taiwan and pointed out a non-self-similar scaling due to the finite crustal thickness. On the other hand, current global analyses of stress drop do not show abnormal values for continental collision zones [e.g., Allmann and Shearer, 2009]. Based on compiled profiles of the finite thickness of the crust and of dip angle variations, we discuss whether such a bend exists in the Himalayan source scaling and its implications for the stress drop that will control strong ground motions. Due to quite low-angle dip faulting, recent earthquakes in the Himalayan collision zone lie at the upper bound of the current source scaling of rupture area versus seismic moment (< Mw 8.0) and do not show significant bending of the source scaling. Toward broadband source modeling for ground motion prediction, we perform empirical Green's function simulations for the 2009 Bhutan and 2015 Gorkha earthquake sequences to quantify both long- and short-period source spectral levels.

  18. Classroom Model of a Wadati Zone.

    ERIC Educational Resources Information Center

    Shea, James H.

    1980-01-01

    Describes a plexiglass and aluminum model of a Wadati zone suitable for classroom exercises and demonstrations in earth science to let students test the hypothesis that earthquake hypocenters near oceanic trenches tend to occur along planes that dip away from the trenches, toward associated island arc or continental mountain chain. (Author/JN)

  19. On the use and the performance of software reliability growth models

    NASA Technical Reports Server (NTRS)

    Keiller, Peter A.; Miller, Douglas R.

    1991-01-01

    We address the problem of predicting future failures for a piece of software. The number of failures occurring during a finite future time interval is predicted from the number of failures observed during an initial period of usage by using software reliability growth models. Two different methods for using the models are considered: straightforward use of individual models, and dynamic selection among models based on goodness-of-fit and quality-of-prediction criteria. Performance is judged by the error of the predicted number of failures over future finite time intervals relative to the number of failures eventually observed during those intervals. Six models of the former kind and eight of the latter are evaluated, based on their performance on twenty data sets. Many open questions remain regarding the use and the performance of software reliability growth models.
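
    The evaluation criterion, the relative error of the predicted number of failures over a future interval, can be sketched with one concrete growth model. The sketch below uses the Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)); the failure data and the eventually observed count are invented for illustration.

```python
# Illustrative sketch: fit a Goel-Okumoto model to initial failure data,
# predict failures in a future interval, and compute the relative error.
import numpy as np
from scipy.optimize import curve_fit

def m(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures."""
    return a * (1.0 - np.exp(-b * t))

# Cumulative failures observed during the initial usage period [0, t0].
t_obs = np.array([5, 10, 20, 40, 60, 80, 100.0])
n_obs = np.array([8, 14, 24, 38, 47, 53, 58.0])

(a, b), _ = curve_fit(m, t_obs, n_obs, p0=[80.0, 0.01])

t0, t1 = 100.0, 200.0
predicted = m(t1, a, b) - m(t0, a, b)  # failures predicted in (t0, t1]
eventually_observed = 14.0             # what later usage revealed (made up)
rel_error = (predicted - eventually_observed) / eventually_observed
print(f"a={a:.1f} b={b:.4f} predicted={predicted:.1f} RE={rel_error:+.2f}")
```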

  20. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
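
    One of the key elements reviewed above, kinship estimation from genotype data, admits a compact sketch. The VanRaden-style genomic relationship matrix below uses random genotypes purely for illustration; it is one common estimator among several, not the specific method of any package discussed.

```python
# A sketch of kinship (genomic relationship) estimation from 0/1/2 genotypes.
import numpy as np

rng = np.random.default_rng(0)
n_ind, n_marker = 50, 1000
geno = rng.integers(0, 3, size=(n_ind, n_marker)).astype(float)  # allele counts

def kinship_vanraden(M: np.ndarray) -> np.ndarray:
    """K = Z Z' / (2 * sum_j p_j (1 - p_j)), with Z = M - 2p (centered)."""
    p = M.mean(axis=0) / 2.0              # allele frequency per marker
    Z = M - 2.0 * p
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

K = kinship_vanraden(geno)
print(K.shape, K.diagonal().mean())  # (50, 50); diagonal near 1 in expectation
```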

  1. The Robust Software Feedback Model: An Effective Waterfall Model Tailoring for Space SW

    NASA Astrophysics Data System (ADS)

    Tipaldi, Massimo; Gotz, Christoph; Ferraguto, Massimo; Troiano, Luigi; Bruenjes, Bernhard

    2013-08-01

    The selection of the most suitable software life cycle process is of paramount importance in any space SW project. Despite being the preferred choice, the waterfall model is often exposed to criticism. As a matter of fact, its main assumption of moving to a phase only when the preceding one is completed and perfected is not easily attainable under demanding SW schedule constraints. In this paper, a tailoring of the software waterfall model (named the "Robust Software Feedback Model") is presented. The proposed methodology sorts out these issues by combining a SW waterfall model with a SW prototyping approach. The former is aligned with the SW main production line and is based on the full ECSS-E-ST-40C life-cycle reviews, whereas the latter is carried out in advance of the main SW streamline (so as to inject its lessons learnt into the main streamline) and is based on a lightweight approach.

  2. Exponential order statistic models of software reliability growth

    NASA Technical Reports Server (NTRS)

    Miller, D. R.

    1985-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, non-identically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
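
    The model class is easy to simulate directly: draw independent exponential lifetimes with per-fault rates and sort them. In the sketch below, equal rates reproduce Jelinski-Moranda-type behavior, while a decaying rate sequence is an illustrative stand-in for the other special cases; the specific rate values are arbitrary.

```python
# Simulation sketch: failure times as order statistics of independent,
# non-identically distributed exponentials.
import numpy as np

rng = np.random.default_rng(1)

def eos_failure_times(rates):
    """Draw T_i ~ Exp(rate_i) independently and return them sorted:
    the k-th smallest is the k-th observed software failure time."""
    samples = rng.exponential(1.0 / np.asarray(rates))
    return np.sort(samples)

n = 50
jm_rates = np.full(n, 0.05)            # Jelinski-Moranda: identical rates
decay_rates = 0.5 / (1.0 + np.arange(n))  # decaying rates (illustrative)
print(eos_failure_times(jm_rates)[:5])
print(eos_failure_times(decay_rates)[:5])
```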

  3. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  4. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time-intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  5. A Survey of Software Reliability Modeling and Estimation

    DTIC Science & Technology

    1983-09-01

    Models considered include the Jelinski-Moranda model, the Geometric model, and Musa's model, together with a Monte Carlo study of the behavior of the least-squares estimators. Cited references include AGARD Conference Proceedings Number 261, 1979, pp. 34-1 to 34-11, and Sukert, Alan and Goel, Amrit, "A Guidebook for Software Reliability Assessment," 1980.

  6. Geological modeling of a fault zone in clay rocks at the Mont-Terri laboratory (Switzerland)

    NASA Astrophysics Data System (ADS)

    Kakurina, M.; Guglielmi, Y.; Nussbaum, C.; Valley, B.

    2016-12-01

    Clay-rich formations are considered to be a natural barrier against the migration of radionuclides or fluids (water, hydrocarbons, CO2). However, little is known about the architecture of faults affecting clay formations, because of their quick alteration at the Earth's surface. The Mont Terri Underground Research Laboratory provides exceptional conditions to investigate an un-weathered, perfectly exposed clay fault zone architecture and to conduct fault activation experiments that allow exploring the conditions for the stability of such clay faults. Here we show first results from a detailed geological model of the Mont Terri Main Fault architecture, built with the GoCad software from a detailed structural analysis of six fully cored and logged boreholes, 30 to 50 m long and spaced 3 to 15 m apart, crossing the fault zone. These high-definition geological data were acquired within the Fault Slip (FS) experiment project, which consisted of fluid injections into different intervals within the fault using the SIMFIP probe to explore the conditions for the fault's mechanical and seismic stability. The Mont Terri Main Fault "core" consists of a thrust zone about 0.8 to 3 m wide that is bounded by two major fault planes. Between these planes, there is an assembly of distinct slickensided surfaces and various facies including scaly clays, fault gouge and fractured zones. Scaly clay, including S-C bands and microfolds, occurs in larger zones at the top and bottom of the Main Fault. A cm-thin layer of gouge, which is known to accommodate high-strain episodes, runs along the upper fault zone boundary. The non-scaly part mainly consists of undeformed rock blocks bounded by slickensides. Such complexity, as well as the continuity of the two major surfaces, is hard to correlate between the different boreholes, even with the high density of geological data within the relatively small volume of the experiment. This may show that poor strain localization occurred during faulting, giving some perspectives about the potential for

  7. Modelling of diesel engine fuelled with biodiesel using engine simulation software

    NASA Astrophysics Data System (ADS)

    Said, Mohd Farid Muhamad; Said, Mazlan; Aziz, Azhar Abdul

    2012-06-01

    This paper concerns the modelling of a diesel engine that operates on biodiesel fuels. The model is used to simulate and predict the performance and combustion of the engine by simplifying the geometry of engine components in the software. The model is produced using the one-dimensional (1D) engine simulation software GT-Power. The fuel properties library in the software is expanded to include palm-oil-based biodiesel fuels. Experimental work was performed to investigate the effect of biodiesel fuels on the heat release profiles and the engine performance curves. The model is validated with experimental data and good agreement is observed. The simulation results show that combustion characteristics and engine performance differ when biodiesel fuels are used instead of No. 2 diesel fuel.

  8. Next generation lightweight mirror modeling software

    NASA Astrophysics Data System (ADS)

    Arnold, William R.; Fitzgerald, Matthew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-09-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 3-5 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any text editor; all the shell thickness parameters and suspension spring rates are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  9. Next-Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  10. Next Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William; Fitzgerald, Matthew; Stahl, Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS; this makes integration of these models into large telescope or satellite models possible.

  11. Next Generation Lightweight Mirror Modeling Software

    NASA Technical Reports Server (NTRS)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper introduces model generation software developed under NASA sponsorship for the design of both terrestrial and space-based mirrors. The software handles any current mirror manufacturing technique: single substrates, multiple arrays of substrates, and the merging of submodels into a single large model. The modeler generates both mirror and suspension-system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS, and NASTRAN. An archive/retrieval system permits complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files that can be modified with any editor; all the key shell-thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. In ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS, which makes integration of these models into larger telescope or satellite models easier.
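
    The deck conventions named in the abstract (ANSYS components, ABAQUS SETS, NASTRAN grid-point SETs for the support-attachment nodes) translate into only a few lines of solver input. The sketch below is illustrative only, not the NASA modeler itself; the node numbers and set name are hypothetical.

      # Minimal sketch (not the NASA modeler): emitting one group of
      # support-attachment nodes in the three solver dialects named above.
      # Node numbers and the set name are hypothetical.
      support_nodes = [101, 102, 103, 104]
      name = "MOUNT_PADS"

      def ansys_component(name, nodes):
          # ANSYS APDL: select the nodes, then store them as a named component
          lines = ["NSEL,NONE"] + [f"NSEL,A,NODE,,{n}" for n in nodes]
          return "\n".join(lines + [f"CM,{name},NODE"])

      def abaqus_nset(name, nodes):
          # ABAQUS: a *NSET keyword block
          return f"*NSET, NSET={name}\n" + ", ".join(map(str, nodes))

      def nastran_set(set_id, nodes):
          # NASTRAN: a SET case-control entry listing the grid points
          return f"SET {set_id} = " + ", ".join(map(str, nodes))

      print(ansys_component(name, support_nodes))
      print(abaqus_nset(name, support_nodes))
      print(nastran_set(1, support_nodes))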

  12. Multimedia Delivery of Coastal Zone Management Training.

    ERIC Educational Resources Information Center

    Clark, M. J.; And Others

    1995-01-01

    Describes Coastal Zone Management (CZM) multimedia course modules, educational software written by the GeoData Institute at the University of Southampton for an environmental management undergraduate course. Examines five elements that converge to create CZM multimedia teaching: course content, source material, a hardware/software delivery system,…

  13. Dynamic Model of Basic Oxygen Steelmaking Process Based on Multi-zone Reaction Kinetics: Model Derivation and Validation

    NASA Astrophysics Data System (ADS)

    Rout, Bapin Kumar; Brooks, Geoff; Rhamdhani, M. Akbar; Li, Zushu; Schrama, Frank N. H.; Sun, Jianjun

    2018-04-01

    A multi-zone kinetic model coupled with a dynamic slag generation model was developed for the simulation of hot metal and slag composition during basic oxygen furnace (BOF) operation. Three reaction zones, (i) the jet impact zone, (ii) the slag-bulk metal zone, and (iii) the slag-metal-gas emulsion zone, were considered in the calculation of overall refining kinetics. In the rate equations, the transient rate parameters were mathematically described as functions of process variables. Microscopic and macroscopic rate calculation methodologies (micro-kinetics and macro-kinetics) were developed to estimate the total refining contributed by the recirculating metal droplets passing through the slag-metal emulsion zone. The micro-kinetics involves developing the rate equation for individual droplets in the emulsion. Mathematical models for the size distribution of initial droplets, the kinetics of simultaneous refining of elements, the residence time in the emulsion, and dynamic interfacial area change were established in the micro-kinetic model. In the macro-kinetics calculation, a droplet generation model was employed and the total amount of refining by the emulsion was calculated by summing the refining from the entire population of returning droplets. A dynamic FetO generation model based on oxygen mass balance was developed and coupled with the multi-zone kinetic model. The effect of post-combustion on the evolution of slag and metal composition was investigated. The model was applied to a 200-ton top-blowing converter, and the simulated metal and slag compositions were found to be in good agreement with the measured data. The post-combustion ratio was found to be an important factor in controlling the FetO content of the slag and the kinetics of Mn and P in the BOF process.
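
    The macro-kinetic summation described above (total emulsion refining as a sum over the returning droplet population) reduces to a short loop. The sketch below is only illustrative: the rate constant, droplet masses, and residence times are invented, and real per-droplet kinetics would come from the paper's micro-kinetic model.

      # Macro-kinetic summation sketch: total refining by the emulsion as a
      # sum over a droplet population, each droplet refined first-order
      # during its residence time. All numbers are invented.
      import math, random

      random.seed(1)
      k = 0.8           # hypothetical first-order refining constant, 1/s
      c0 = 0.04         # hypothetical initial impurity mass fraction

      def droplet_refining(mass_kg, residence_s):
          # impurity mass removed from one droplet while in the emulsion
          return mass_kg * c0 * (1.0 - math.exp(-k * residence_s))

      droplets = [(random.uniform(1e-4, 1e-2), random.uniform(1.0, 20.0))
                  for _ in range(1000)]        # (mass, residence time)
      removed = sum(droplet_refining(m, tau) for m, tau in droplets)
      print(f"impurity removed by emulsion this step: {removed:.4f} kg")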

  14. Subduction zone decoupling/retreat modeling explains south Tibet (Xigaze) and other supra-subduction zone ophiolites and their UHP mineral phases

    NASA Astrophysics Data System (ADS)

    Butler, Jared P.; Beaumont, Christopher

    2017-04-01

    The plate tectonic setting in which proto-ophiolite 'oceanic' lithosphere is created remains controversial, with a number of environments suggested. Recent opinions tend to coalesce around supra-subduction zone (SSZ) forearc extension, with a popular conceptual model in which the proto-ophiolite forms during foundering of oceanic lithosphere at the time of spontaneous or induced onset of subduction. This mechanism is favored in intra-oceanic settings where the subducting lithosphere is old and the upper plate is young and thin. We investigate an alternative mechanism; namely, decoupling of the subducting oceanic lithosphere in the forearc of an active continental margin, followed by subduction zone (trench) retreat and creation of a forearc oceanic rift basin, containing proto-ophiolite lithosphere, between the continental margin and the retreating subduction zone. A template of 2D numerical model experiments examines the trade-off between the strength of viscous coupling in the lithospheric subduction channel and the net slab pull of the subducting lithosphere. Three tectonic styles are observed: 1) C, continuous subduction without forearc decoupling; 2) R, forearc decoupling followed by rapid subduction zone retreat; 3) B, breakoff of subducting lithosphere followed by re-initiation of subduction and, in some cases, forearc decoupling (B-R). In one case (BA-B-R, where BA denotes backarc) subduction zone retreat follows backarc rifting. Subduction zone decoupling is analyzed using frictional-plastic yield theory and the Stefan solution for the separation of plates containing a viscous fluid. The numerical model results are used to explain the formation of the Xigaze group ophiolites, southern Tibet, which formed in the Lhasa terrane forearc, likely following earlier subduction and not necessarily during subduction initiation. Either there was normal coupled subduction before subduction zone decoupling, or precursor slab breakoff, subduction re-initiation, and then decoupling.

  15. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    NASA Technical Reports Server (NTRS)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and that use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness and instability.
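
    As a toy illustration of the pruning idea, the sketch below calibrates a log-linear effort model on synthetic data and drops candidate effort multipliers whenever removing one improves leave-one-out error. The data, the multiplier count, and the rejection rule are all invented for illustration; this is not the paper's method or data.

      # Calibrate a log-linear effort model on synthetic data, pruning
      # multipliers whenever dropping one improves leave-one-out error.
      # Data, multiplier count, and the rule itself are invented.
      import numpy as np

      rng = np.random.default_rng(0)
      n, p = 30, 6                            # 30 projects, 6 multipliers
      X = rng.normal(size=(n, p))
      beta = np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0])   # only 2 matter
      y = X @ beta + rng.normal(scale=0.3, size=n)      # log(effort)

      def loo_error(cols):
          err = 0.0
          for i in range(n):
              mask = np.arange(n) != i
              b, *_ = np.linalg.lstsq(X[mask][:, cols], y[mask], rcond=None)
              err += (y[i] - X[i, cols] @ b) ** 2
          return err / n

      cols = list(range(p))
      while len(cols) > 1:
          trials = {c: loo_error([k for k in cols if k != c]) for c in cols}
          best = min(trials, key=trials.get)
          if trials[best] >= loo_error(cols):
              break
          cols.remove(best)
      print("retained effort multipliers:", cols)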

  16. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software continue to satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology, and databases.

  17. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    ERIC Educational Resources Information Center

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  18. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with large numbers of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software now allows scientists to cooperate with each other in constructing, sharing, and critiquing knowledge models. Scientists collaborating from remote locations, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do so at a special videoconferencing facility.

  19. Numerical Modelling of Subduction Zones: a New Beginning

    NASA Astrophysics Data System (ADS)

    Ficini, Eleonora; Dal Zilio, Luca; Doglioni, Carlo; Gerya, Taras V.

    2016-04-01

    Subduction zones are among the most studied, although still controversial, geodynamic processes. Is it a passive or an active mechanism in the frame of plate tectonics? How does subduction initiate? What controls the differences among the slabs and related orogens and accretionary wedges? The geometry and kinematics at plate boundaries point to a "westerly" polarized flow of plates, which implies a relative opposed flow of the underlying Earth's mantle, with the decoupling located at about 100-200 km depth in the low-velocity zone or LVZ (Doglioni and Panza, 2015 and references therein). This flow is the simplest explanation for the asymmetric pattern of subduction zones: "westerly" directed slabs are steeper and deeper than the "easterly or northeasterly" directed ones, which are less steep and shallower, and two end members of orogens associated with the downgoing slabs can be distinguished in terms of topography, type of rocks, magmatism, backarc spreading or its absence, foredeep subsidence rate, etc. The classic asymmetry comparing the western Pacific slabs and orogens (low topography and backarc spreading in the upper plate) and the eastern Pacific subduction zones (high topography and deep rocks involved in the upper plate) cannot be ascribed to the age of the subducting lithosphere. In fact, the same asymmetry can be recognized all over the world regardless of the type and age of the subducting lithosphere, being rather controlled by the geographic polarity of the subduction. All plate boundaries move "west". The present generation of numerical models of subduction zones is based on the idea that a subducting slab is primarily controlled by its negative buoyancy. However, there are several counterarguments against this assumption, which cannot explain the aforementioned global asymmetric signatures. Moreover, petrological reconstructions of the lithospheric and underlying mantle composition point to a much smaller negative buoyancy than predicted.

  20. Root Zone Water Quality Model (RZWQM2): Model use, calibration, and validation

    USDA-ARS's Scientific Manuscript database

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This paper outlines the principles of calibr...

  1. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    NASA Astrophysics Data System (ADS)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods, such as the Booch method and OMT, supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views, and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF, and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  2. Explorative study on management model of tourism business zone at Kuta, Bali

    NASA Astrophysics Data System (ADS)

    Astawa, I. K.; Suardani, A. A. P.; Harmini, A. A. A. N.

    2018-01-01

    Business activities through asset management of the indigenous village of Kuta provide an opportunity for the community to participate in improving their welfare. This study aims to analyze the management model of Kuta's tourism business zones, the involvement of stakeholders in managing those zones, and the implications of each tourism business zone for the level of community welfare. Data collection was done by observation, interview, questionnaire, and documentation. The main instruments of this study were the researchers themselves, assisted by an interview guideline. The results showed that a management model has been arranged for the 5 tourism business zones in the indigenous village of Kuta. The involvement of all stakeholders in managing the tourism business zones follows the procedures for the execution of duties and provides security, comfort, and certainty for business activities in each zone. The tourism business has raised the level of community welfare in each zone of the indigenous village of Kuta, and the community is satisfied with the income earned from work in each business zone.

  3. Variable-intercept panel model for deformation zoning of a super-high arch dam.

    PubMed

    Shi, Zhongwen; Gu, Chongshi; Qin, Dong

    2016-01-01

    This study determines dam deformation similarity indexes based on an analysis of deformation zoning features and panel data clustering theory, with comprehensive consideration of the actual deformation law of super-high arch dams and the spatial-temporal features of dam deformation. Measurement methods for these indexes are studied. Based on the established deformation similarity criteria, the principle used to determine the number of dam deformation zones is constructed through the entropy weight method. This study proposes a deformation zoning method for super-high arch dams and its implementation steps, analyzes the effect of special influencing factors of different dam zones on the deformation, introduces dummy variables that represent the special effect on dam deformation, and establishes a variable-intercept panel model for deformation zoning of super-high arch dams. Based on different patterns of the special effect in the variable-intercept panel model, two panel analysis models were established to capture fixed and random effects of dam deformation. The Hausman test method for model selection and a model effectiveness assessment method are discussed. Finally, the effectiveness of the established models is verified through a case study.
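
    For readers unfamiliar with the entropy weight method used above, the sketch below shows the standard computation on a synthetic data matrix: indexes whose values vary more across measuring points carry lower entropy and therefore higher weight. The matrix and its dimensions are invented.

      # Entropy weight method on a synthetic matrix of similarity indexes
      # (20 measuring points x 4 indexes); the data here are random.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.random((20, 4)) + 1e-9        # keep entries strictly positive

      P = X / X.sum(axis=0)                 # column-wise proportions
      k = 1.0 / np.log(X.shape[0])
      E = -k * (P * np.log(P)).sum(axis=0)  # entropy of each index
      w = (1.0 - E) / (1.0 - E).sum()       # entropy weights
      print("entropy weights:", np.round(w, 3))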

  4. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  5. THE U.S. ENVIRONMENTAL PROTECTION AGENCY VISUAL PLUMES MODELING SOFTWARE

    EPA Science Inventory

    The U.S. Environmental Protection Agency's Center for Exposure Assessment Modeling (CEAM) at the Ecosystems Research Division in Athens, Georgia develops environmental exposure models, including plume models, and provides technical assistance to model users. The mixing zone and f...

  6. JAMS - a software platform for modular hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kralisch, Sven; Fischer, Christian

    2015-04-01

    Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow for the seamless creation of integrated models from less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an open-source software framework that has been designed especially to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach to representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models in domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of the models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.

  7. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high-quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of the stereoscopic morphology of the hand. On the basis of horizontally sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools are downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand can be employed according to individual needs. These new tools, involving realistic images of a cadaver and diverse functions, are expected to improve comprehensive knowledge of the hand shape.

  8. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand

    PubMed Central

    2018-01-01

    Background The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high-quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of the stereoscopic morphology of the hand. Methods On the basis of horizontally sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. Results All of the software tools are downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand can be employed according to individual needs. Conclusion These new tools, involving realistic images of a cadaver and diverse functions, are expected to improve comprehensive knowledge of the hand shape. PMID:29441756

  9. Dynamic topography in subduction zones: insights from laboratory models

    NASA Astrophysics Data System (ADS)

    Bajolet, Flora; Faccenna, Claudio; Funiciello, Francesca

    2014-05-01

    The topography in subduction zones can exhibit very complex patterns due to the variety of forces operating in this setting. While the theoretical isostatic value can be deduced from the density structure of the lithosphere, the effect of flexural bending and the dynamic component of topography are difficult to quantify. In this work, we attempt to measure and analyze the topography of the overriding plate during subduction, compared to a pure shortening setting. We use analogue models in which the lithospheres are modeled by thin-sheet layers of silicone putty lying on low-viscosity syrup (the asthenosphere). The model is shortened by a piston pushing an oceanic plate, while a continental plate including a weak zone to localize the deformation is fixed. In one type of experiment, the oceanic plate bends and subducts underneath the continental one; in a second type, the two plates are in contact without any trench and thus simply shorten. The topography evolution is monitored with a laser scanner. In the shortening model, the elevation increases progressively, especially in the weak zone, and is consistent with expected isostatic values. In the subduction model, the topography is characterized, from the piston to the back-wall, by a low elevation of the dense oceanic plate, a flexural bulge, the trench forming a deep depression, the highly elevated weak zone, and the continental upper plate of intermediate elevation. The topography of the upper plate is consistent with isostatic values for very early stages, but exhibits lower elevations than expected for later stages. For the same amount of shortening of the continental plate, the thickening is the same and the plate should have the same elevation in both types of models. However, comparing the topography at 20, 29, and 39% of shortening, we found that the weak zone is 0.4 to 0.6 mm lower when there is active subduction. These values correspond to 2.6 to 4 km in nature. Although these values are high, they are of the same order as

  10. Zone model predictive control: a strategy to minimize hyper- and hypoglycemic events.

    PubMed

    Grosman, Benyamin; Dassau, Eyal; Zisser, Howard C; Jovanovic, Lois; Doyle, Francis J

    2010-07-01

    Development of an artificial pancreas based on an automatic closed-loop algorithm that uses a subcutaneous insulin pump and a continuous glucose sensor is a goal of biomedical engineering research. However, closing the loop for the artificial pancreas still presents many challenges, including model identification and the design of a control algorithm that will keep the subject with type 1 diabetes mellitus in normoglycemia for the longest duration and under maximal safety considerations. An artificial pancreatic beta-cell based on zone model predictive control (zone-MPC) that is tuned automatically has been evaluated on the University of Virginia/University of Padova Food and Drug Administration-accepted metabolic simulator. Zone-MPC is applied when a fixed set point is not defined and the control objective can be expressed as a zone. Because euglycemia is usually defined as a range, zone-MPC is a natural control strategy for the artificial pancreatic beta-cell. Clinical data usually include discrete information about insulin delivery and meals, which can be used to generate personalized models. It is argued that mapping clinical insulin administration and meal history through two different second-order transfer functions improves the identification accuracy of these models. Moreover, using mapped insulin as an additional state in zone-MPC enriches the information about past control moves, thereby reducing the probability of overdosing. In this study, zone-MPC is tested in three different modes using unannounced and announced meals at their nominal values and with 40% uncertainty. Ten adult in silico subjects were evaluated following a scenario of mixed meals with 75, 75, and 50 grams of carbohydrates (CHOs) consumed at 7 am, 1 pm, and 8 pm, respectively. Zone-MPC results are compared to those of the "optimal" open-loop preadjusted treatment. Zone-MPC succeeds in maintaining glycemic responses closer to euglycemia compared to the "optimal" open-loop treatment in the three
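
    The core idea, penalizing predicted glucose only when it leaves a target zone rather than tracking a fixed set point, can be written down in a few lines. The sketch below is a generic toy zone-MPC with made-up linear dynamics and an assumed [80, 140] mg/dL zone; it is not the paper's identified patient model or controller tuning, and it assumes the numpy and cvxpy packages.

      # Toy zone-MPC sketch: penalize predicted glucose only when it leaves
      # the [80, 140] mg/dL zone. Dynamics, horizon, and weights are all
      # invented for illustration. Requires numpy and cvxpy.
      import numpy as np
      import cvxpy as cp

      A = np.array([[0.95, 0.05], [0.0, 0.90]])   # hypothetical 2-state model
      B = np.array([[0.0], [-0.5]])               # insulin lowers glucose
      C = np.array([[1.0, 0.0]])                  # output = glucose, mg/dL
      N = 12                                      # prediction horizon
      x0 = np.array([180.0, 0.0])                 # start above the zone
      lo, hi = 80.0, 140.0                        # glycemic target zone

      x = cp.Variable((2, N + 1))
      u = cp.Variable((1, N))
      cost, cons = 0, [x[:, 0] == x0]
      for t in range(N):
          y = C @ x[:, t + 1]
          cost += cp.sum_squares(cp.pos(y - hi))   # penalty above the zone
          cost += cp.sum_squares(cp.pos(lo - y))   # penalty below the zone
          cost += 0.01 * cp.sum_squares(u[:, t])   # insulin effort
          cons += [x[:, t + 1] == A @ x[:, t] + B @ u[:, t], u[:, t] >= 0]
      cp.Problem(cp.Minimize(cost), cons).solve()
      print("first insulin move:", float(u.value[0, 0]))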

  11. Modeling of a 3DTV service in the software-defined networking architecture

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2014-11-01

    In this article, a newly developed concept for modeling a multimedia service offering stereoscopic motion imagery is presented. The proposed model is based on the Software-Defined Networking (SDN) architecture. A definition of a 3D television service spanning the SDN concept is identified, exposing the basic characteristics of a 3DTV service in a modern networking organization layout. Furthermore, exemplary functionalities of the proposed 3DTV model are depicted. It is indicated that modeling a 3DTV service in the Software-Defined Networking architecture leads to a multiplicity of improvements, especially in the flexibility of a service supporting heterogeneous end-user devices.

  12. A revised dislocation model of interseismic deformation of the Cascadia subduction zone

    USGS Publications Warehouse

    Wang, Kelin; Wells, Ray E.; Mazzotti, Stephane; Hyndman, Roy D.; Sagiya, Takeshi

    2003-01-01

    CAS3D‐2, a new three‐dimensional (3‐D) dislocation model, is developed to model interseismic deformation rates at the Cascadia subduction zone. The model is considered a snapshot description of the deformation field that changes with time. The effect of northward secular motion of the central and southern Cascadia forearc sliver is subtracted to obtain the effective convergence between the subducting plate and the forearc. Horizontal deformation data, including strain rates and surface velocities from Global Positioning System (GPS) measurements, provide primary geodetic constraints, but uplift rate data from tide gauges and leveling also provide important validations for the model. A locked zone, based on the results of previous thermal models constrained by heat flow observations, is located entirely offshore beneath the continental slope. Similar to previous dislocation models, an effective zone of downdip transition from locking to full slip is used, but the slip deficit rate is assumed to decrease exponentially with downdip distance. The exponential function resolves the problem of overpredicting coastal GPS velocities and underpredicting inland velocities by previous models that used a linear downdip transition. A wide effective transition zone (ETZ) partially accounts for stress relaxation in the mantle wedge that cannot be simulated by the elastic model. The pattern of coseismic deformation is expected to be different from that of interseismic deformation at present, 300 years after the last great subduction earthquake. The downdip transition from full rupture to no slip should take place over a much narrower zone.
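
    The exponential downdip transition described above admits a one-line closed form: across the effective transition zone the slip deficit rate decays from the full convergence rate V as s(w) = V exp(-w/lambda), where w is downdip distance past the locked zone. The sketch below just evaluates that curve; the rate and e-folding length are arbitrary illustration values, not the CAS3D-2 parameters.

      # Slip deficit rate decaying exponentially with downdip distance w
      # past the locked zone: s(w) = V * exp(-w / lam). V and lam are
      # arbitrary illustration values.
      import numpy as np

      V, lam = 40.0, 30.0                      # mm/yr and km (hypothetical)
      for w in np.linspace(0.0, 120.0, 7):     # km past the locked zone
          print(f"{w:6.1f} km downdip: slip deficit {V * np.exp(-w / lam):5.1f} mm/yr")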

  13. Analogue modelling of inclined, brittle-ductile transpression: Testing analytical models through natural shear zones (external Betics)

    NASA Astrophysics Data System (ADS)

    Barcos, L.; Díaz-Azpiroz, M.; Balanyá, J. C.; Expósito, I.; Jiménez-Bonilla, A.; Faccenna, C.

    2016-07-01

    The combination of analytical and analogue models gives new opportunities to better understand the kinematic parameters controlling the evolution of transpression zones. In this work, we carried out a set of analogue models using the kinematic parameters of transpressional deformation obtained by applying a general triclinic transpression analytical model to a tabular-shaped shear zone in the external Betic Chain (Torcal de Antequera massif). According to the results of the analytical model, we used two oblique convergence angles to reproduce the main structural and kinematic features of the structural domains observed within the Torcal de Antequera massif (α = 15° for the outer domains and α = 30° for the inner domain). Two parallel inclined backstops (one fixed and the other mobile) reproduce the geometry of the shear zone walls of the natural case. Additionally, we applied the digital particle image velocimetry (PIV) method to calculate the velocity field of the incremental deformation. Our results suggest that the spatial distribution of the main structures observed in the Torcal de Antequera massif reflects different modes of strain partitioning and strain localization between the two domain types, which are related to the variation in the oblique convergence angle and the presence of steep planar velocity and rheological discontinuities (the shear zone walls in the natural case). In the 15° model, strain partitioning is simple and strain localization is high: a single narrow shear zone develops close and parallel to the fixed backstop, bounded by strike-slip faults and internally deformed by R and P shears. In the 30° model, strain partitioning is strong, generating regularly spaced oblique-to-the-backstop thrusts and strike-slip faults. At the final stages of the 30° experiment, deformation affects the entire model box. Our results show that the application of analytical modelling to natural transpressive zones related to upper crustal deformation

  14. Probabilistic Modeling and Evaluation of Surf Zone Injury Occurrence along the Delaware Coast

    NASA Astrophysics Data System (ADS)

    Doelp, M.; Puleo, J. A.

    2017-12-01

    Beebe Healthcare in Lewes, DE, collected surf zone injury (SZI) data along the DE coast for seven summer seasons, 2010 through 2016. Data include, but are not limited to, time of injury, gender, age, and activity. Over 2000 injuries were recorded over the seven-year period, including 116 spinal injuries and three fatalities. These injuries are predominantly wave-related incidents, including wading (41%), bodysurfing (26%), and body-boarding (20%). Despite the large number of injuries, beach-associated hazards do not receive the same level of awareness that rip currents receive. Injury population statistics revealed that those between the ages of 11 and 15 suffered the greatest proportion of injuries (18.8%). Male water users were twice as likely to sustain injury as their female counterparts. Also, non-locals were roughly six times more likely to sustain injury than locals. In 2016, five or more injuries occurred on 18.5% of the days sampled, and no injuries occurred on 31.4% of the sample days. The episodic nature of injury occurrence and the population statistics indicate the importance of environmental conditions and human behavior in surf zone injuries. Higher-order statistics are necessary to effectively assess SZI causes and the likelihood of occurrence on a particular day. A Bayesian network using Netica software (Norsys) was constructed to model SZI and predict changes in injury likelihood on an hourly basis. The network incorporates environmental data collected by weather stations, NDBC buoy #44009, the USACE buoy at Bethany Beach, and research personnel on the beach. The Bayesian model includes prior (e.g., historic) information to infer relationships between the provided parameters. Sensitivity analysis determined that the most influential variables for injury likelihood are population, water temperature, nearshore wave height, beach slope, and the day of the week. Forecasting during the 2017 summer season will test the model's ability to predict injury likelihood.

  15. Slab1.0: A three-dimensional model of global subduction zone geometries

    NASA Astrophysics Data System (ADS)

    Hayes, Gavin P.; Wald, David J.; Johnson, Rebecca L.

    2012-01-01

    We describe and present a new model of global subduction zone geometries, called Slab1.0. An extension of previous efforts to constrain the two-dimensional non-planar geometry of subduction zones around the focus of large earthquakes, Slab1.0 describes the detailed, non-planar, three-dimensional geometry of approximately 85% of subduction zones worldwide. While the model focuses on the detailed form of each slab from their trenches through the seismogenic zone, where it combines data sets from active source and passive seismology, it also continues to the limits of their seismic extent in the upper-mid mantle, providing a uniform approach to the definition of the entire seismically active slab geometry. Examples are shown for two well-constrained global locations; models for many other regions are available and can be freely downloaded in several formats from our new Slab1.0 website, http://on.doi.gov/d9ARbS. We describe improvements in our two-dimensional geometry constraint inversion, including the use of 'average' active source seismic data profiles in the shallow trench regions where data are otherwise lacking, derived from the interpolation between other active source seismic data along-strike in the same subduction zone. We include several analyses of the uncertainty and robustness of our three-dimensional interpolation methods. In addition, we use the filtered, subduction-related earthquake data sets compiled to build Slab1.0 in a reassessment of previous analyses of the deep limit of the thrust interface seismogenic zone for all subduction zones included in our global model thus far, concluding that the width of these seismogenic zones is on average 30% larger than previous studies have suggested.

  16. Effective Team Support: From Modeling to Software Agents

    NASA Technical Reports Server (NTRS)

    Remington, Roger W. (Technical Monitor); John, Bonnie; Sycara, Katia

    2003-01-01

    The purpose of this research contract was to perform multidisciplinary research between CMU psychologists, computer scientists, and engineers and NASA researchers to design a next-generation collaborative system to support a team of human experts and intelligent agents. To achieve robust performance enhancement of such a system, we had proposed to perform task and cognitive modeling to thoroughly understand the impact technology makes on the organization and on key individual personnel. Guided by cognitively inspired requirements, we would then develop software agents that support the human team in decision making, information filtering, and information distribution and integration to enhance team situational awareness. During the period covered by this final report, we made substantial progress on modeling infrastructure and task infrastructure. Work is continuing under a different contract to complete empirical data collection, cognitive modeling, and the building of software agents to support the team's task.

  17. Corridor-based forecasts of work-zone impacts for freeways.

    DOT National Transportation Integrated Search

    2011-08-09

    This project developed an analysis methodology and associated software implementation for the evaluation of significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst to predict the operational im...

  18. ESPC Common Model Architecture Earth System Modeling Framework (ESMF) Software and Application Development

    DTIC Science & Technology

    2015-09-30

    …originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. ESPS… [report front matter: Cecelia Deluca, NESII/CIRES/NOAA Earth System Research Laboratory, Boulder, CO] …The National Unified Operational Prediction Capability (NUOPC) was established between NOAA and the Navy to develop a common software architecture for easy and efficient interoperability.

  19. EPA MODELING TOOLS FOR CAPTURE ZONE DELINEATION

    EPA Science Inventory

    The EPA Office of Research and Development supports a step-wise modeling approach for design of wellhead protection areas for water supply wells. A web-based WellHEDSS (wellhead decision support system) is under development for determining when simple capture zones (e.g., centri...
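
    As a reminder of what the simple capture zones above look like analytically, the sketch below evaluates the classic single-well-in-uniform-flow results (stagnation-point distance and capture widths) for invented aquifer numbers; it is a textbook illustration, not part of WellHEDSS.

      # Classic single-well-in-uniform-flow capture zone (Javandel & Tsang,
      # 1986). Pumping rate, thickness, and regional flux are invented.
      import math

      Q = 500.0      # pumping rate, m^3/day
      B = 20.0       # aquifer thickness, m
      q = 0.05       # regional Darcy flux, m/day

      x_stag = Q / (2.0 * math.pi * B * q)   # stagnation point downstream
      w_well = Q / (2.0 * B * q)             # capture width at the well line
      w_max = Q / (B * q)                    # asymptotic upstream width
      print(f"stagnation point {x_stag:.0f} m downstream; "
            f"width {w_well:.0f} m at the well, {w_max:.0f} m far upstream")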

  20. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  1. POWERLIB: SAS/IML Software for Computing Power in Multivariate Linear Models

    PubMed Central

    Johnson, Jacqueline L.; Muller, Keith E.; Slaughter, James C.; Gurka, Matthew J.; Gribbin, Matthew J.; Simpson, Sean L.

    2014-01-01

    The POWERLIB SAS/IML software provides convenient power calculations for a wide range of multivariate linear models with Gaussian errors. The software includes the Box, Geisser-Greenhouse, Huynh-Feldt, and uncorrected tests in the "univariate" approach to repeated measures (UNIREP); the Hotelling-Lawley Trace, Pillai-Bartlett Trace, and Wilks Lambda tests in the "multivariate" approach (MULTIREP); as well as a limited but useful range of mixed models. The familiar univariate linear model with Gaussian errors is an important special case. For estimated covariance, the software provides confidence limits for the resulting estimated power. All power and confidence limit values can be output to a SAS dataset, which can be used to easily produce plots and tables for manuscripts. PMID:25400516
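
    The univariate special case mentioned above can be sketched outside SAS/IML: the power of the overall F test in a linear model follows from the noncentral F distribution. The snippet below is a SciPy illustration with a hypothetical design, not POWERLIB itself.

      # Power of the overall F test in a univariate linear model via the
      # noncentral F distribution (the univariate special case above).
      # Sample size, model df, and effect size are hypothetical.
      from scipy import stats

      n, p, alpha = 40, 3, 0.05
      effect_f2 = 0.15                         # hypothetical Cohen's f^2

      df1, df2 = p, n - p - 1
      nc = effect_f2 * n                       # noncentrality parameter
      fcrit = stats.f.ppf(1.0 - alpha, df1, df2)
      power = stats.ncf.sf(fcrit, df1, df2, nc)
      print(f"power = {power:.3f}")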

  2. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, and IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  3. Model format for a vaccine stability report and software solutions.

    PubMed

    Shin, Jinho; Southern, James; Schofield, Timothy

    2009-11-01

    A session of the International Association for Biologicals Workshop on Stability Evaluation of Vaccines, a Life Cycle Approach, was devoted to a model format for a vaccine stability report and to software solutions. Presentations highlighted the utility of a model format that will conform to regulatory requirements and the ICH common technical document. However, there needs to be flexibility to accommodate individual company practices. Adoption of a model format is premised upon agreement regarding content between industry and regulators, and upon ease of use. Software requirements will include ease of use and protections against inadvertent misspecification of the stability design or misinterpretation of program output.

  4. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software continue to satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology, and databases. To appear in the Journal of Database Management.

  5. ARC Software and Models

    Science.gov Websites

    …produce software code and methodologies that are transferred to TARDEC and industry partners. Related publication fragments: ASME Dynamic Systems and Control Conference, 2013, DOI:10.1115/DSCC2013-3935; IEEE Transactions on Control Systems Technology, DOI:10.1109/TCST.2012.2217143.

  6. Modeling Latent Growth Curves With Incomplete Data Using Different Types of Structural Equation Modeling and Multilevel Software

    ERIC Educational Resources Information Center

    Ferrer, Emilio; Hamagami, Fumiaki; McArdle, John J.

    2004-01-01

    This article offers different examples of how to fit latent growth curve (LGC) models to longitudinal data using a variety of different software programs (i.e., LISREL, Mx, Mplus, AMOS, SAS). The article shows how the same model can be fitted using both structural equation modeling and multilevel software, with nearly identical results, even in…
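
    To make the SEM/multilevel equivalence concrete, the sketch below fits the multilevel form of a linear latent growth curve (random intercepts and slopes over time) with statsmodels on simulated data. statsmodels is not one of the packages compared in the article, and the data and growth parameters are invented.

      # Latent growth curve as a multilevel model: random intercepts and
      # slopes over time, fitted with statsmodels on simulated data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_subj, n_wave = 100, 4
      df = pd.DataFrame({
          "id": np.repeat(np.arange(n_subj), n_wave),
          "time": np.tile(np.arange(n_wave), n_subj),
      })
      b0 = rng.normal(10.0, 2.0, n_subj)[df["id"]]    # latent intercepts
      b1 = rng.normal(0.5, 0.3, n_subj)[df["id"]]     # latent slopes
      df["y"] = b0 + b1 * df["time"] + rng.normal(0.0, 1.0, len(df))

      lgc = smf.mixedlm("y ~ time", df, groups=df["id"], re_formula="~time")
      print(lgc.fit().summary())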

  7. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN, and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed in both interactive and batch modes. Interactive mode is the default and is assumed in the discussion in this document unless stated otherwise.

  8. Modelling of the MEA float zone using accelerometer data

    NASA Technical Reports Server (NTRS)

    Alexander, J. Iwan D.

    1993-01-01

    During a floating zone experiment involving the growth of indium on a recent orbiter mission (STS 32), oscillations of the zone shape were observed to occur in response to the background acceleration. An understanding of the nature of the response of the zone shape to forced (g-jitter) oscillations, and predictions of its impact on future experiments, are of great interest not only to the PIs but also to other commercial and academic investigators who plan to fly similar experiments on the orbiter and on the space station. Motivated by this, a 15-month study was undertaken to analyze the nature of the g-sensitivity of the STS 32 floating zone crystal growth experiment. Numerical models were used to describe the time-dependent free surface motion of the zone as it responds to the spacecraft residual acceleration. Relevant experimental data concerning the acceleration environment were obtained from the Honeywell in Space Accelerometer (HISA) investigators through MSFC's ACAP program and were processed and analyzed. For the indium floating zone experiment, a series of calculations was made using time-dependent axial accelerations g(t). The form of g(t) included simple sinusoidal disturbances as well as actual data (subject to appropriate filtering) measured on the STS 32 mission. The focus was on calculating the response of the free surface of the zone as well as the internal flows and internal heat transfer. The influence of solidification on the response of the zone shape was also examined but found to be negligible.

  9. A posteriori operation detection in evolving software models

    PubMed Central

    Langer, Philip; Wimmer, Manuel; Brosch, Petra; Herrmannsdörfer, Markus; Seidl, Martina; Wieland, Konrad; Kappel, Gerti

    2013-01-01

    Like every software artifact, software models are subject to continuous evolution. The operations applied between two successive versions of a model are crucial for understanding its evolution. Generic approaches for detecting operations a posteriori identify atomic operations but neglect composite operations, such as refactorings, which leads to cluttered difference reports. To tackle this limitation, we present an orthogonal extension of existing atomic operation detection approaches that also detects composite operations. Our approach searches for occurrences of composite operations within a set of detected atomic operations in a post-processing manner. One major benefit is that specifications available for executing composite operations are reused for detecting applications of them. We evaluate the accuracy of the approach in a real-world case study and investigate the scalability of our implementation in an experiment. PMID:23471366
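
    The post-processing idea is easy to picture: scan the set of detected atomic operations for patterns that match a composite operation's specification. The toy sketch below detects a hypothetical "move feature" composite (a delete of a feature in one class plus an add of the same feature in another); the operation encoding is invented, not the paper's metamodel.

      # Toy a posteriori detection of a composite "move feature" operation
      # (delete feature in one class + add of the same feature in another)
      # within a set of atomic operations. Encoding is invented.
      atomic_ops = [
          ("delete_feature", "Person", "address"),
          ("add_feature", "Customer", "address"),
          ("rename_class", "Order", "PurchaseOrder"),
      ]

      def detect_move_feature(ops):
          deletes = {(c, f) for op, c, f in ops if op == "delete_feature"}
          adds = {(c, f) for op, c, f in ops if op == "add_feature"}
          return [(f, src, dst)
                  for (src, f) in deletes
                  for (dst, g) in adds
                  if f == g and src != dst]

      print(detect_move_feature(atomic_ops))
      # -> [('address', 'Person', 'Customer')]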

  10. Deformation and stress change associated with plate interaction at subduction zones: a kinematic modelling

    NASA Astrophysics Data System (ADS)

    Zhao, Shaorong; Takemoto, Shuzo

    2000-08-01

    The interseismic deformation associated with plate coupling at a subduction zone is commonly simulated by the steady-slip model in which a reverse dip-slip is imposed on the down-dip extension of the locked plate interface, or by the backslip model in which a normal slip is imposed on the locked plate interface. It is found that these two models, although totally different in principle, produce similar patterns for the vertical deformation at a subduction zone. This suggests that it is almost impossible to distinguish between these two models by analysing only the interseismic vertical deformation observed at a subduction zone. The steady-slip model cannot correctly predict the horizontal deformation associated with plate coupling at a subduction zone, a fact that is proved by both the numerical modelling in this study and the GPS (Global Positioning System) observations near the Nankai trough, southwest Japan. It is therefore inadequate to simulate the effect of the plate coupling at a subduction zone by the steady-slip model. It is also revealed that the unphysical assumption inherent in the backslip model of imposing a normal slip on the locked plate interface makes it impossible to predict correctly the horizontal motion of the subducted plate and the stress change within the overthrust zone associated with the plate coupling during interseismic stages. If the analysis made in this work is proved to be correct, some of the previous studies on interpreting the interseismic deformation observed at several subduction zones based on these two models might need substantial revision. On the basis of the investigations on plate interaction at subduction zones made using the finite element method and the kinematic/mechanical conditions of the plate coupling implied by the present plate tectonics, a synthesized model is proposed to simulate the kinematic effect of the plate interaction during interseismic stages. A numerical analysis shows that the proposed model
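
    As a numeric aside on the backslip idea, the sketch below evaluates the classic one-dimensional antiplane analogue (Savage and Burford, 1973): interseismic surface velocity across a fault locked to depth D approaches the far-field rate V away from the trace. This is the strike-slip analogue only, not the dip-slip subduction geometry analyzed above, and the numbers are invented.

      # Antiplane interseismic velocity profile (Savage & Burford, 1973):
      # v(x) = (V/pi) * arctan(x/D). Far-field rate V and locking depth D
      # are invented; this is the strike-slip analogue only.
      import math

      V, D = 40.0, 15.0                 # mm/yr, km
      for x in (-100.0, -30.0, 0.0, 30.0, 100.0):   # km from the fault trace
          v = (V / math.pi) * math.atan(x / D)
          print(f"x = {x:6.1f} km: v = {v:6.1f} mm/yr")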

  11. Artificial Intelligence Software Engineering (AISE) model

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  12. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
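
    For readers new to the underlying formalism, the sketch below plays the elementary Petri net token game (a transition is enabled when its input places hold enough tokens; firing moves tokens) on an invented two-step process net; it illustrates plain place/transition nets, not the XML nets introduced in the chapter.

      # Elementary place/transition net token game on an invented net.
      marking = {"submitted": 1, "reviewed": 0, "approved": 0}
      transitions = {
          "review":  ({"submitted": 1}, {"reviewed": 1}),   # (pre, post)
          "approve": ({"reviewed": 1}, {"approved": 1}),
      }

      def enabled(name):
          pre, _ = transitions[name]
          return all(marking[p] >= w for p, w in pre.items())

      def fire(name):
          assert enabled(name), f"{name} is not enabled"
          pre, post = transitions[name]
          for p, w in pre.items():
              marking[p] -= w
          for p, w in post.items():
              marking[p] += w

      fire("review")
      fire("approve")
      print(marking)   # {'submitted': 0, 'reviewed': 0, 'approved': 1}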

  13. ModBack - simplified contaminant source zone delineation using backtracking

    NASA Astrophysics Data System (ADS)

    Thielsch, K.; Herold, M.; Ptak, T.

    2012-12-01

    Contaminated groundwater poses a serious threat to drinking water resources all over the world. Even though contaminated water might be detected in observation wells, a proper cleanup is often only successful if the source of the contamination is detected and subsequently removed, contained, or remediated. The high costs of groundwater remediation could be significantly reduced if, from the outset, a focus is placed on source zone detection. ModBack combines several existing modelling tools in one easy-to-use GIS-based interface that helps delineate potential contaminant source zones in the subsurface. The software is written in Visual Basic 3.5 and uses the ArcObjects library to implement all required GIS applications. It can run without modification on any Microsoft Windows-based PC with sufficient RAM and at least Microsoft .NET Framework 3.5. Using ModBack requires the additional installation of the following software: Processing Modflow Pro 7.0, ModPath, CSTREAM (Bayer-Raich et al., 2003), Golden Software Surfer, and Microsoft Excel. The graphical user interface of ModBack is separated into four blocks of procedures dealing with data input, groundwater modelling, backtracking, and analyses. Geographical data input includes all georeferenced information pertaining to the study site. Information on subsurface contamination is gathered either by conventional sampling of monitoring wells or by conducting integral pumping tests at control planes with a specific sampling scheme. Hydraulic data from these pumping tests, together with all other available information, are then used to set up a groundwater flow model of the study site, which provides the flow field for transport simulations within the subsequent contamination backtracking procedures, starting from the defined control planes. The backtracking results are then analysed within ModBack. The potential areas of contamination source presence or absence are determined based on the procedure used by Jarsjö et

  14. Surrogate Safety Assessment Model (SSAM)--software user manual

    DOT National Transportation Integrated Search

    2008-05-01

    This document presents guidelines for the installation and use of the Surrogate Safety Assessment Model (SSAM) software. For more information regarding the SSAM application, including discussion of theoretical background and the results of a series o...

  15. Microplate and shear zone models for oceanic spreading center reorganizations

    NASA Technical Reports Server (NTRS)

    Engeln, Joseph F.; Stein, Seth; Werner, John; Gordon, Richard

    1988-01-01

    The kinematics of rift propagation and the resulting geometries of various tectonic elements are reviewed for two plates with no overlap zone. The formation and evolution of overlap regions is discussed using schematic models. The models are scaled in space and time to approximate the Easter plate, but are simplified to emphasize key elements. The tectonic evolution of overlap regions which act as rigid microplates and shear zones is discussed, and the use of relative motion and structural data to discriminate between the two types of models is investigated. The effect of propagation rate and rise time on the size, shape, and deformation of the overlap region is demonstrated.

  16. Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models

    NASA Astrophysics Data System (ADS)

    Chu, A.

    2014-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the maximum-likelihood estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and of the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for similar data modeling purposes, the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions raise similar issues. My program uses a robust method of presetting a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example, visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has the potential to be hosted online. The Java language is used for the software's core computing part, and an optional interface to the statistical package R is provided.
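
    For orientation, the temporal part of the ETAS conditional intensity combines a constant background rate with Omori-law triggering scaled by the magnitude of each past event. The sketch below illustrates this in the purely temporal case; the catalog and parameter values are hypothetical, and the software described above additionally handles spatial kernels and EM estimation.

        # Sketch of the temporal ETAS conditional intensity (Ogata-style);
        # illustrative parameters and catalog only.
        import numpy as np

        def etas_intensity(t, times, mags, mu, K, alpha, c, p, m0):
            """lambda(t) = mu + sum over past events of Omori-law triggering."""
            past = times < t
            trig = K * np.exp(alpha * (mags[past] - m0)) / (t - times[past] + c) ** p
            return mu + trig.sum()

        def expected_aftershocks(m, K, alpha, c, p, m0, horizon):
            """Expected direct aftershocks of a magnitude-m event (p != 1)."""
            integral = (c ** (1 - p) - (horizon + c) ** (1 - p)) / (p - 1)
            return K * np.exp(alpha * (m - m0)) * integral

        times = np.array([0.0, 1.2, 3.5])       # days (hypothetical catalog)
        mags = np.array([5.0, 4.1, 4.6])
        print(etas_intensity(4.0, times, mags, mu=0.1, K=0.05,
                             alpha=1.0, c=0.01, p=1.2, m0=4.0))
        print(expected_aftershocks(5.0, K=0.05, alpha=1.0, c=0.01,
                                   p=1.2, m0=4.0, horizon=365.0))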

  17. Mapping the Habitable Zone of Exoplanets with a 2D Energy Balance Model

    NASA Astrophysics Data System (ADS)

    Moon, Nicole Taylor; Dr. Lisa Kaltenegger, Dr. Ramses Ramirez

    2018-01-01

    Traditionally, the habitable zone has been defined as the range of distances at which liquid water could exist on the surface of a rocky planet. However, models of different complexity (simplified and fast 1D; complex and time-intensive 3D) derive different boundaries for the habitable zone. The goal of this project was to test a new intermediate-complexity 2D energy balance model, add a new ice albedo feedback mechanism, and derive the habitable zone boundaries. After completing this first project, we also studied how other feedback mechanisms, such as the presence of clouds and the carbonate-silicate cycle, affected the location of the habitable zone boundaries using this 2D model. This project was completed as part of a 2017 summer REU program hosted by Cornell's Center for Astrophysics and Planetary Science and in partnership with the Carl Sagan Institute.
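
    The core of such an energy balance model is the balance of absorbed stellar flux against linearized outgoing infrared radiation plus meridional heat diffusion, with an albedo that rises where ice forms. A minimal 1D latitudinal sketch of this balance follows (Budyko-Sellers type, on a grid in the sine of latitude); all coefficients are illustrative Earth-like values, not the project's actual 2D code.

        # Minimal 1D energy balance model with a smoothed ice-albedo feedback.
        # Earth-like illustrative coefficients; the project used a 2D EBM.
        import numpy as np

        x = np.linspace(-0.95, 0.95, 40)                   # x = sin(latitude)
        dx = x[1] - x[0]
        S = 1360.0 / 4 * (1 - 0.482 * (3 * x**2 - 1) / 2)  # annual-mean insolation [W/m^2]
        A, B = 203.3, 2.09      # OLR = A + B*T [W/m^2], T in deg C
        D = 0.55                # meridional diffusion coefficient [W/m^2/K]
        T = np.full(x.size, 10.0)

        for _ in range(20000):                             # relax to equilibrium
            albedo = 0.45 - 0.15 * np.tanh((T + 10.0) / 5.0)   # icy where cold
            F = np.zeros(x.size + 1)                       # interface heat fluxes
            xi = (x[:-1] + x[1:]) / 2
            F[1:-1] = (1 - xi**2) * np.diff(T) / dx        # spherical diffusion term
            transport = D * np.diff(F) / dx                # zero flux at the poles
            T += (S * (1 - albedo) - (A + B * T) + transport) / 600.0

        print(f"equator T = {T[x.size // 2]:.1f} C, pole T = {T[-1]:.1f} C")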

  18. Introduction to Financial Projection Models. Business Management Instructional Software.

    ERIC Educational Resources Information Center

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  19. Modelling Subduction Zone Magmatism Due to Hydraulic Fracture

    NASA Astrophysics Data System (ADS)

    Lawton, R.; Davies, J. H.

    2014-12-01

    The aim of this project is to test the hypothesis that subduction zone magmatism involves hydraulic fractures propagating from the oceanic crust to the mantle wedge source region (Davies, 1999). We aim to test this hypothesis by developing a numerical model of the process, and then comparing model outputs with observations. The hypothesis proposes that the water interconnects in the slab following an earthquake. If sufficient pressure develops, a hydrofracture occurs. The hydrofracture will expand in the direction of the least compressive stress and propagate in the direction of the most compressive stress, which is out into the wedge. We can thus calculate the hydrofracture path and end point, given the start location on the slab and the propagation distance, and hence predict where water is added to the mantle wedge. To take this further we have developed a thermal model of a subduction zone. The model uses a finite difference, marker-in-cell method to solve the heat equation (Gerya, 2010). The velocity field was prescribed using the analytical expression of cornerflow (Batchelor, 1967). The markers contained within the fixed grid are used to track the different compositions and their properties. The subduction zone thermal model was benchmarked (van Keken et al., 2008). We used the hydrous melting parameterization of Katz et al. (2003) to calculate the degree of melting caused by the addition of water to the wedge. We investigate models where the hydrofractures, with properties constrained by estimated water fluxes, have random end points. The model predicts degree of melting, magma productivity, temperature of the melt and water content in the melt for different initial water fluxes. Future models will also include the buoyancy effect of the melt and residue. Batchelor, Cambridge UP, 1967. Davies, Nature, 398: 142-145, 1999. Gerya, Cambridge UP, 2010. Katz, Geochem. Geophys. Geosyst., 4(9), 2003. van Keken et al., Phys. Earth. Planet. In., 171:187-197, 2008.

  20. Regulation of Ion Gradients across Myocardial Ischemic Border Zones: A Biophysical Modelling Analysis

    PubMed Central

    Niederer, Steven

    2013-01-01

    The myocardial ischemic border zone is associated with the initiation and sustenance of arrhythmias. The profile of ionic concentrations across the border zone plays a significant role in determining cellular electrophysiology and conductivity, yet their spatial-temporal evolution and regulation are not well understood. To investigate the changes in ion concentrations that regulate cellular electrophysiology, a mathematical model of ion movement in the intra- and extracellular space in the presence of ionic, potential and material property heterogeneities was developed. The model simulates the spatial and temporal evolution of concentrations of potassium, sodium, chloride, calcium, hydrogen and bicarbonate ions and carbon dioxide across an ischemic border zone. Ischemia was simulated by sodium-potassium pump inhibition, potassium channel activation and respiratory and metabolic acidosis. The model predicted significant disparities in the width of the border zone for each ionic species, with intracellular sodium and extracellular potassium having discordant gradients, facilitating multiple gradients in cellular properties across the border zone. Extracellular potassium was found to have the largest border zone, and this was attributed to the voltage dependence of the potassium channels. The model also predicted the efflux of ions from the ischemic region due to electrogenic drift and diffusion within the intra- and extracellular space, respectively, which contributed to their depletion in the ischemic region. PMID:23577101

  1. Assessment of the geothermal potential of fault zones in Germany by numerical modelling

    NASA Astrophysics Data System (ADS)

    Kuder, Jörg

    2017-04-01

    Fault zones with significantly better permeabilities than their host rocks can act as natural migration paths for ascending fluids that are able to transport thermal energy from deep geological formations. Under these circumstances, fault zones are interesting for geothermal utilization, especially deep-rooted zones extending to at least 7 km depth (Jung et al. 2002, Paschen et al. 2003). One objective of the joint project "The role of deep rooting fault zones for geothermal energy utilization", supported by the Federal Ministry for Economic Affairs and Energy, was the evaluation of the geothermal potential of fault zones in Germany by means of numerical modelling with COMSOL. To achieve this goal a method was developed to estimate the potential of regionally generalized fault zones in a simple yet sophisticated way. The main problem for the development of a numerical model is the lack of geological and hydrological data. To address this problem, the geothermal potential of a cube with 1 km side length, including a 20 m wide, 1000 m high and 1000 m long fault zone, was calculated as a unified model with changing parameter sets. The properties of the surrounding host rock and the fault zone are assumed homogeneous. The numerical models were calculated with a broad variety of fluid flow, rock and fluid property parameters for the depth intervals of 3000-4000 m, 4000-5000 m, 5000-6000 m and 6000-7000 m. The fluid parameters depend on temperature, salt load and initial pressure. The porosity and permeability values are provided by the database of the geothermal information system (GeotIS). The results are summarized in a table of values of geothermal energy modelled with different parameter sets and depths. The geothermal potential of fault zones in Germany was then calculated on the basis of this table and information from the geothermal atlas of Germany (2016).

  2. Viscoelastic shear zone model of a strike-slip earthquake cycle

    USGS Publications Warehouse

    Pollitz, F.F.

    2001-01-01

    I examine the behavior of a two-dimensional (2-D) strike-slip fault system embedded in a 1-D elastic layer (schizosphere) overlying a uniform viscoelastic half-space (plastosphere) and within the boundaries of a finite width shear zone. The viscoelastic coupling model of Savage and Prescott [1978] considers the viscoelastic response of this system, in the absence of the shear zone boundaries, to an earthquake occurring within the upper elastic layer, steady slip beneath a prescribed depth, and the superposition of the responses of multiple earthquakes with characteristic slip occurring at regular intervals. So formulated, the viscoelastic coupling model predicts that sufficiently long after initiation of the system, (1) average fault-parallel velocity at any point is the average slip rate of that side of the fault and (2) far-field velocities equal the same constant rate. Because of the sensitivity to the mechanical properties of the schizosphere-plastosphere system (i.e., elastic layer thickness, plastosphere viscosity), this model has been used to infer such properties from measurements of interseismic velocity. Such inferences exploit the predicted behavior at a known time within the earthquake cycle. By modifying the viscoelastic coupling model to satisfy the additional constraint that the absolute velocity at prescribed shear zone boundaries is constant, I find that even though the time-averaged behavior remains the same, the spatiotemporal pattern of surface deformation (particularly its temporal variation within an earthquake cycle) is markedly different from that predicted by the conventional viscoelastic coupling model. These differences are magnified as plastosphere viscosity is reduced or as the recurrence interval of periodic earthquakes is lengthened. Application to the interseismic velocity field along the Mojave section of the San Andreas fault suggests that the region behaves mechanically like a ~600-km-wide shear zone accommodating 50 mm/yr fault
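
    Late in the interseismic period, the surface velocity predicted by such models approaches the classic elastic arctangent profile, which is the usual reference for comparison. A minimal sketch of that reference profile follows (the Savage-Burford elastic form, not Pollitz's full viscoelastic solution; the slip rate and locking depth are illustrative values).

        # Elastic interseismic velocity profile across a strike-slip fault
        # (Savage-Burford form); illustrative parameter values.
        import numpy as np

        def interseismic_velocity(x_km, slip_rate=50.0, locking_depth_km=15.0):
            """Fault-parallel surface velocity [mm/yr] at distance x [km]."""
            return slip_rate / np.pi * np.arctan(x_km / locking_depth_km)

        x = np.linspace(-300.0, 300.0, 13)
        print(np.round(interseismic_velocity(x), 1))  # antisymmetric about the fault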

  3. pyLIMA : The first open source microlensing modeling software

    NASA Astrophysics Data System (ADS)

    Bachelet, Etienne; Street, Rachel; Bozza, Valerio

    2018-01-01

    Microlensing is highly sensitive to planets beyond the snowline and distributed along the line of sight towards the Galactic Bulge. The WFIRST-AFTA mission should detect about 3000 of these planets and significantly improve our knowledge of planet formation and statistics, complementing results found by transit and radial velocity methods. However, the modeling of microlensing events is challenging in several respects, leading to highly time-consuming analyses. After a brief summary of these challenges, I will present pyLIMA, the first open source microlensing modeling software. The design goals of this software are flexibility, power and user-friendliness. This presentation will focus on various cases and early results.
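
    The simplest model such a package fits is the point-source point-lens (PSPL) light curve, whose magnification follows the standard Paczynski form. A minimal sketch is given below; the event parameters are hypothetical, and pyLIMA's actual interface and model set are richer.

        # Point-source point-lens (PSPL) microlensing magnification
        # (Paczynski form); hypothetical event parameters.
        import numpy as np

        def pspl_magnification(t, t0, u0, tE):
            """Magnification A(t) for impact parameter u0 and Einstein time tE."""
            u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
            return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

        t = np.linspace(-50.0, 50.0, 101)             # days relative to peak
        A = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
        print(f"peak magnification ~ {A.max():.1f}")  # ~10 for u0 = 0.1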

  4. Weak ductile shear zone beneath the western North Anatolian Fault Zone: inferences from earthquake cycle model constrained by geodetic observations

    NASA Astrophysics Data System (ADS)

    Yamasaki, T.; Wright, T. J.; Houseman, G. A.

    2013-12-01

    After large earthquakes, rapid postseismic transient motions are commonly observed. Later in the loading cycle, strain is typically focused in narrow regions around the fault. In simple two-layer models of the loading cycle for strike-slip faults, rapid post-seismic transients require low viscosities beneath the elastic layer, but localized strain later in the cycle implies high viscosities in the crust. To explain this apparent paradox, complex transient rheologies have been invoked. Here we test an alternative hypothesis in which spatial variations in material properties of the crust can explain the geodetic observations. We use a 3D viscoelastic finite element code to examine two simple models of periodic fault slip: a stratified model in which crustal viscosity decreases exponentially with depth below an upper elastic layer, and a block model in which a low viscosity domain centered beneath the fault is embedded in a higher viscosity background representing normal crust. We test these models using GPS data acquired before and after the 1999 Izmit/Duzce earthquakes on the North Anatolian Fault Zone (Turkey). The model with depth-dependent viscosity can show both high postseismic velocities and preseismic localization of the deformation if the viscosity contrast from top to bottom of the layer exceeds a factor of about 10^4. However, with no lateral variations in viscosity, this model cannot explain the proximity to the fault of maximum postseismic velocities. In contrast, the model which includes a localized weak zone beneath the faulted elastic lid can explain all the observations, if the weak zone extends down to mid-crustal levels and outward to 10 or 20 km from the fault. The non-dimensional ratio of relaxation time to earthquake repeat time, τ/Δt, is the critical parameter in controlling the observed deformation. In the weak-zone model, τ/Δt should be in the range 0.005 to 0.01 in the weak domain, and larger than ~ 1.0 elsewhere. This implies a viscosity

  5. Discrete Address Beacon System (DABS) Software System Reliability Modeling and Prediction.

    DTIC Science & Technology

    1981-06-01

    Service (ATARS) module because of its interim status. Reliability prediction models for software modules were derived and then verified by matching...System (ATCRBS) and thus can be introduced gradually and economically without major operational or procedural change. Since DABS uses monopulse...line analysis tools or are used during maintenance or pre-initialization were not modeled because they are not part of the mission software. The ATARS

  6. Quantitative software models for the estimation of cost, size, and defects

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.

    2002-01-01

    The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.

  7. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    NASA Technical Reports Server (NTRS)

    Blaurock, Carl

    2009-01-01

    The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral

  8. A Bayesian modification to the Jelinski-Moranda software reliability growth model

    NASA Technical Reports Server (NTRS)

    Littlewood, B.; Sofer, A.

    1983-01-01

    The Jelinski-Moranda (JM) model for software reliability was examined. It is suggested that a major reason for the poor results given by this model is the poor performance of the maximum likelihood (ML) method of parameter estimation. A reparameterization and Bayesian analysis, involving a slight modelling change, are proposed. It is shown that this new Bayesian Jelinski-Moranda (BJM) model is mathematically quite tractable, and several metrics of interest to practitioners are obtained. The BJM and JM models are compared using several sets of real software failure data, and in all cases the BJM model gives superior reliability predictions. A change to the assumption underlying both models, to represent the debugging process more accurately, is also discussed.
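
    In the JM model, the failure rate between the (i-1)th and ith failures is lambda_i = phi * (N - i + 1), where N is the initial fault count and phi a per-fault rate. A minimal sketch of the ML estimation whose fragility motivates the Bayesian treatment (grid search over N, with the closed-form phi for fixed N; the inter-failure times below are hypothetical):

        # Maximum-likelihood fit of the Jelinski-Moranda model;
        # hypothetical inter-failure times, for illustration only.
        import numpy as np

        def jm_loglik(N, phi, t):
            """Log-likelihood of inter-failure times t under the JM model."""
            i = np.arange(1, len(t) + 1)
            lam = phi * (N - i + 1)
            return np.sum(np.log(lam) - lam * t)

        def fit_jm(t, N_max=200):
            """Grid search over N; phi has a closed form for fixed N."""
            n = len(t)
            i = np.arange(1, n + 1)
            best = None
            for N in range(n, N_max + 1):
                phi = n / np.sum((N - i + 1) * t)
                ll = jm_loglik(N, phi, t)
                if best is None or ll > best[0]:
                    best = (ll, N, phi)
            return best

        t = np.array([7., 11., 8., 10., 15., 22., 20., 25., 30., 40.])
        print(fit_jm(t))   # (log-likelihood, estimated N, estimated phi)

    With data like these, the estimate of N can drift toward the search boundary, which is a symptom of the ML instability the paper addresses.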

  9. 2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrington, David Bradley; Waters, Jiajia

    Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the models and of the software itself. We also continue to improve the physical modeling methods, developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or that they may alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.

  10. Software Metrics

    DTIC Science & Technology

    1988-12-01

    The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate...c. SPQR Model (Jones); d. COPMO (Thebaut)...SEI...T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's...time T (in seconds) is simply derived from E by dividing by the Stroud number, S: T = E/S. The value

  11. Definition of zones with different levels of productivity within an agricultural field using fuzzy modeling

    USDA-ARS?s Scientific Manuscript database

    Zoning of agricultural fields is an important task for the utilization of precision farming technology. One method for the definition of zones with different levels of productivity is based on a fuzzy indicator model. A fuzzy indicator model for identification of zones with different levels of productivit...

  12. Gravity modelling of the Hellenic subduction zone — a regional study

    NASA Astrophysics Data System (ADS)

    Casten, U.; Snopek, K.

    2006-05-01

    The Hellenic subduction zone is clearly expressed in the arc-shaped distribution of earthquake epicenters and gravity anomalies, which connect the Peloponnesos with Crete and Anatolia. In this region, oceanic crust of the African plate collides northward with continental crust of the Aegean microplate, which itself is pushed apart to the south-west by the Anatolian plate and, at the same time, is characterised by crustal extension. The result is an overall collision rate of up to 4 cm/year and a retreating subduction process. Recent passive and active seismic studies on and around Crete yielded first structural results that, although not consistent in all details, are useful for supporting gravity modelling. This was undertaken with the aim of presenting the first 3D density structure of the entire subduction zone. Gravity interpretation was based on a Bouguer map, newly compiled using data from land, marine and satellite sources. The anomalies range from +170 mGal (Cretan Sea) to -10 mGal (Mediterranean Ridge). 3D gravity modelling was done using the modelling software IGMAS. The computed Bouguer map fits the low frequency part of the observed one, which is controlled by variations in Moho depth (less than 20 km below the Cretan Sea, reaching 30 km below Crete) and the extremely thick sedimentary cover (partly up to 18 km) of the Mediterranean Ridge. The southernmost edge of the Eurasian plate, with its more triangular-shaped backstop area, was traced south off Crete. Only 50 to 100 km further to the south, the edge of the African continent was traced as well. In between these boundaries there is African oceanic crust, which has a clear arc-shaped detachment line situated at the Eurasian continental edge. The subduction arc is open towards the north, its slab separates hotter mantle material (lower density) below the updoming Moho of the Cretan Sea from colder material (higher density) in the south. Subjacent to the upper continental crust of Crete is a thickened layer of

  13. Data reduction of room tests for zone model validation

    Treesearch

    M. Janssens; H. C. Tran

    1992-01-01

    Compartment fire zone models are based on many simplifying assumptions, in particular that gases stratify in two distinct layers. Because of these assumptions, certain model output is in a form unsuitable for direct comparison to measurements made in full-scale room tests. The experimental data must first be reduced and transformed to be compatible with the model...

  14. Do Over or Make Do? Climate Models as a Software Development Challenge (Invited)

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.

    2010-12-01

    We present the results of a comparative study of the software engineering culture and practices at four different earth system modeling centers: the UK Met Office Hadley Centre, the National Center for Atmospheric Research (NCAR), The Max-Planck-Institut für Meteorologie (MPI-M), and the Institut Pierre Simon Laplace (IPSL). The study investigated the software tools and techniques used at each center to assess their effectiveness. We also investigated how differences in the organizational structures, collaborative relationships, and technical infrastructures constrain the software development and affect software quality. Specific questions for the study included 1) Verification and Validation - What techniques are used to ensure that the code matches the scientists’ understanding of what it should do? How effective are these at eliminating errors of correctness and errors of understanding? 2) Coordination - How are the contributions from across the modeling community coordinated? For coupled models, how are the differences in the priorities of different, overlapping communities of users addressed? 3) Division of responsibility - How are the responsibilities for coding, verification, and coordination distributed between different roles (scientific, engineering, support) in the organization? 4) Planning and release processes - How do modelers decide on priorities for model development, and how do they decide which changes to tackle in a particular release of the model? 5) Debugging - How do scientists debug the models, what types of bugs do they find in their code, and how do they find them? The results show that each center has evolved a set of model development practices that are tailored to its needs and organizational constraints. These practices emphasize scientific validity, but tend to neglect other software qualities, and all the centers struggle frequently with software problems. The testing processes are effective at removing software errors prior to

  15. An Overview of Software for Conducting Dimensionality Assessment in Multidimensional Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2012-01-01

    An overview of popular software packages for conducting dimensionality assessment in multidimensional models is presented. Specifically, five popular software packages are described in terms of their capabilities to conduct dimensionality assessment with respect to the nature of analysis (exploratory or confirmatory), types of data (dichotomous,…

  16. Modeling Degradation Product Partitioning in Chlorinated-DNAPL Source Zones

    NASA Astrophysics Data System (ADS)

    Boroumand, A.; Ramsburg, A.; Christ, J.; Abriola, L.

    2009-12-01

    Metabolic reductive dechlorination degrades aqueous phase contaminant concentrations, increasing the driving force for DNAPL dissolution. Results from laboratory and field investigations suggest that accumulation of cis-dichloroethene (cis-DCE) and vinyl chloride (VC) may occur within DNAPL source zones. The lack of (or slow) degradation of cis-DCE and VC within bioactive DNAPL source zones may result in these dechlorination products becoming distributed among the solid, aqueous, and organic phases. Partitioning of cis-DCE and VC into the organic phase may reduce aqueous phase concentrations of these contaminants and result in the enrichment of these dechlorination products within the non-aqueous phase. Enrichment of degradation products within DNAPL may reduce some of the advantages associated with the application of bioremediation in DNAPL source zones. Thus, it is important to quantify how partitioning (between the aqueous and organic phases) influences the transport of cis-DCE and VC within bioactive DNAPL source zones. In this work, abiotic two-phase (PCE-water) one-dimensional column experiments are modeled using analytical and numerical methods to examine the rate of partitioning and the capacity of PCE-DNAPL to reversibly sequester cis-DCE. These models consider aqueous-phase, nonaqueous-phase, and combined aqueous plus nonaqueous-phase mass transfer resistance using linear driving force and spherical diffusion expressions. Model parameters are examined and compared for different experimental conditions to evaluate the mechanisms controlling partitioning. The Biot number, a dimensionless index of the ratio of the aqueous phase mass transfer rate in the boundary layer to the mass transfer rate within the NAPL, is used to characterize conditions in which either or both processes are controlling. Results show that application of a single aqueous resistance is capable of capturing breakthrough curves when DNAPL is distributed in porous media as low
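
    As a rough illustration of the mass-transfer formulation, a linear driving force model moves solute between the aqueous and NAPL phases in proportion to the departure from partitioning equilibrium, with film and intra-NAPL resistances in series; the ratio of the two rates plays the role of the Biot number discussed above. A minimal sketch with well-mixed, equal-volume compartments and hypothetical parameter values:

        # Linear-driving-force (LDF) sketch of aqueous/NAPL partitioning;
        # hypothetical parameters, equal-volume compartments for simplicity.
        K_nw = 30.0      # NAPL-water partition coefficient [-]
        k_f = 2.0        # aqueous boundary-layer transfer rate [1/d]
        k_n = 0.2        # intra-NAPL transfer rate [1/d]
        print("Biot-type ratio k_f/k_n =", k_f / k_n)   # >1: NAPL-side control

        k_eff = 1.0 / (1.0 / k_f + 1.0 / k_n)           # resistances in series
        C_w, C_n, dt = 1.0, 0.0, 0.01                   # initial state [mg/L], step [d]
        for _ in range(5000):
            flux = k_eff * (K_nw * C_w - C_n)           # drives C_n toward K_nw*C_w
            C_w -= flux * dt
            C_n += flux * dt
        print(f"C_w = {C_w:.3f}, C_n = {C_n:.3f}")      # ~1/(1+K_nw) and ~K_nw/(1+K_nw)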

  17. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  18. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  19. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    PubMed

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS over the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals into clinical effects and to guide therapeutic choices. We have developed software at Poitiers University Hospital that allows real-time objective mapping, on a touch screen interface, of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself. The purpose of this article is to describe this intraoperative mapping software in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software enables patients with failed back surgery syndrome, candidates for SCS lead implantation, to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by percutaneous or multicolumn surgical SCS leads implanted under awake anaesthesia allows intraoperative lead programming, and possibly lead positioning, to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of the NML software should not be limited to optimizing one specific device implantation in a patient, but should also allow various stimulation strategies to be compared instantaneously, by characterizing new technical parameters such as "coverage efficacy" and "device specificity" on selected subgroups of patients. Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming

  20. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve usage outcomes. The ISO/IEC 25000 standard emerged as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  1. Generalized Pseudo-Reaction Zone Model for Non-Ideal Explosives

    NASA Astrophysics Data System (ADS)

    Wescott, B. L.

    2007-12-01

    The pseudo-reaction zone model was proposed to improve engineering-scale simulations of high explosives that have a slow reaction component. In this work an extension of the pseudo-reaction zone model is developed for non-ideal explosives that propagate well below the steady planar Chapman-Jouguet velocity. A programmed-burn method utilizing Detonation Shock Dynamics (DSD) and a detonation-velocity-dependent pseudo-reaction rate has been developed for non-ideal explosives and applied to the explosive mixture of ammonium nitrate and fuel oil (ANFO). The pseudo-reaction rate is calibrated to the experimentally obtained normal detonation velocity-shock curvature relation. Cylinder test simulations predict the proper expansion to within 1% even though significant reaction occurs as the cylinder expands.

  2. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    PubMed

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes a method, developed from our prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.

  3. The prediction of radiofrequency ablation zone volume using vascular indices of 3-dimensional volumetric colour Doppler ultrasound in an in vitro blood-perfused bovine liver model

    PubMed Central

    Lanctot, Anthony C; McCarter, Martin D; Roberts, Katherine M; Glueck, Deborah H; Dodd, Gerald D

    2017-01-01

    Objective: To determine the most reliable predictor of radiofrequency (RF) ablation zone volume among three-dimensional (3D) volumetric colour Doppler vascular indices in an in vitro blood-perfused bovine liver model. Methods: 3D colour Doppler volume data of the local hepatic parenchyma were acquired from 37 areas of 13 bovine livers connected to an in vitro oxygenated blood perfusion system. The Doppler vascular indices of vascularization index (VI), flow index (FI) and vascularization flow index (VFI) were obtained from the volume data using 3D volume analysis software. 37 RF ablations were performed at the same locations from which the ultrasound data were obtained. The relationships between these vascular indices and the ablation zone volumes measured from gross specimens were analyzed using a general linear mixed model fit with a random effect for liver and backward stepwise regression analysis. Results: FI was significantly associated with ablation zone volumes measured on gross specimens (p = 0.0047), but explained little of the variance (Rβ² = 0.21). Ablation zone volume decreased by 0.23 cm³ (95% confidence interval: −0.38, −0.08) for every 1 increase in FI. Neither VI nor VFI was significantly associated with ablation zone volumes (p > 0.05). Conclusion: Although FI was associated with ablation zone volumes, it could not sufficiently explain their variability, limiting its clinical applicability. VI, FI and VFI are not clinically useful in the prediction of RF ablation zone volume in the liver. Advances in knowledge: Despite a significant association of FI with ablation zone volumes, VI, FI and VFI cannot be used for their prediction. Different Doppler vascular indices need to be investigated for clinical use. PMID:27925468

  4. Shear heating and metamorphism in subduction zones, 1. Thermal models

    NASA Astrophysics Data System (ADS)

    Kohn, M. J.; Castro, A. E.; Spear, F. S.

    2017-12-01

    Popular thermal-mechanical models of modern subduction systems are 100-500 °C colder at c. 50 km depth than pressure-temperature (P-T) conditions determined from exhumed metamorphic rocks. This discrepancy has been ascribed by some to profound bias in the rock record, i.e. metamorphic rocks reflect only anomalously warm subduction, not normal subduction. Accurately inferring subduction zone thermal structure, whether from models or rocks, is crucial for predicting depths of seismicity, fluid release, and sub-arc melting conditions. Here, we show that adding realistic shear stresses to thermal models implies P-T conditions quantitatively consistent with those recorded by exhumed metamorphic rocks, suggesting that metamorphic rock P-T conditions are not anomalously warm. Heat flow measurements from subduction zone fore-arcs typically indicate effective coefficients of friction (µ) ranging from 0.025 to 0.1. We included these coefficients of friction in analytical models of subduction zone interface temperatures. Using global averages of subducting plate age (50 Ma), subduction velocity (6 cm/yr), and subducting plate geometry (central Chile), temperatures at 50 km depth (1.5 GPa) increase by c. 200 °C for µ=0.025 to 700 °C for µ=0.1. However, at high temperatures, thermal softening will reduce frictional heating, and temperatures will not increase as much with depth. Including initial weakening of materials ranging from wet quartz (c. 300 °C) to diabase (c. 600 °C) in the analytical models produces concave-upward P-T distributions on P-T diagrams, with temperatures c. 100 to 500 °C higher than models with no shear heating. The absolute P-T conditions and concave-upward shape of the shear-heating + thermal softening models almost perfectly match the distribution of P-T conditions derived from a compilation of exhumed metamorphic rocks. Numerical models of modern subduction zones that include shear heating also overlap metamorphic data. Thus, excepting the
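
    The size of the frictional heating term is easy to estimate: with brittle shear stress tau = mu*rho*g*z and convergence velocity v, the heat flux generated on the interface is q = tau*v. A back-of-envelope sketch using the coefficients of friction quoted above (other values illustrative; the paper's analytical models also include thermal softening):

        # Back-of-envelope shear heating on a subduction interface:
        # tau = mu*rho*g*z (brittle), q = tau*v. Illustrative values.
        rho, g = 3000.0, 9.8            # density [kg/m^3], gravity [m/s^2]
        v = 0.06 / 3.15e7               # 6 cm/yr converted to [m/s]
        for mu in (0.025, 0.1):
            for z in (20e3, 50e3):      # depth [m]
                tau = mu * rho * g * z  # shear stress [Pa]
                q = tau * v             # heating rate per unit area [W/m^2]
                print(f"mu={mu:5}, z={z/1e3:3.0f} km: "
                      f"tau={tau/1e6:6.1f} MPa, q={q*1e3:6.1f} mW/m^2")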

  5. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    Heavy rains and steep, volcanic slopes in El Salvador cause numerous landslides every year, posing a persistent threat to the population, economy and environment. Although potential debris inundation hazard zones have been delineated using digital elevation models (DEMs), some disparities exist between the simulated zones and actual affected areas. Moreover, these hazard zones have only been identified for volcanic lahars and not for the shallow landslides that occur nearly every year. This is despite the availability of tools to delineate a variety of landslide types (e.g., the USGS-developed LAHARZ software). Limitations in DEM spatial resolution, age of the data, and hydrological preprocessing techniques can contribute to inaccurate hazard zone definitions. This study investigates the impacts of using different elevation models and pit filling techniques in the final debris hazard zone delineations, in an effort to determine which combination of methods most closely agrees with observed landslide events. In particular, a national DEM digitized from topographic sheets from the 1970s and 1980s provides an elevation product at a 10 meter resolution. Both natural and anthropogenic modifications of the terrain limit the accuracy of current landslide hazard assessments derived from this source. Global products from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global DEM (ASTER GDEM) offer more recent data but at the cost of spatial resolution. New data derived from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) in 2013 provide the opportunity to update hazard zones at a higher spatial resolution (approximately 6 meters). Hydrological filling of sinks or pits for current hazard zone simulation has previously been achieved through the ArcInfo Spatial Analyst. Such hydrological processing typically only fills pits and can lead to drastic modifications of original elevation values

  6. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  7. New reductions of the Astrographic Catalogue. Plate adjustments of the Algiers, Oxford I and II, and Vatican Zones.

    NASA Astrophysics Data System (ADS)

    Urban, S. E.; Martin, J. C.; Jackson, E. S.; Corbin, T. E.

    1996-07-01

    The U. S. Naval Observatory is in the process of making new reductions of the Astrographic Catalogue using a modern reference catalog, the ACRS, and new data analysis and reduction software. Currently ten AC zones have been reduced. This paper discusses the reduction models and results from the Algiers, Oxford I and II, and Vatican zones (those of the Cape zone are discussed elsewhere). The resulting star positions will be combined with those of the U.S. Naval Observatory's Twin Astrograph Catalog to produce a catalog of positions and proper motions in support of the Sloan Digital Sky Survey.

  8. Creep model of unsaturated sliding zone soils and long-term deformation analysis of landslides

    NASA Astrophysics Data System (ADS)

    Zou, Liangchao; Wang, Shimei; Zhang, Yeming

    2015-04-01

    Sliding zone soil is a special soil layer formed during the development of a landslide. Its creep behavior plays a significant role in the long-term deformation of landslides. Due to rainfall infiltration and reservoir water level fluctuation, the soils in the slide zone are often in an unsaturated state. Therefore, the investigation of the creep behavior of unsaturated sliding zone soils is of great importance for understanding the mechanism of the long-term deformation of landslides in reservoir areas. In this study, full-process creep curves of the unsaturated soils in the sliding zone at different net confining pressures, matric suctions and stress levels were obtained from a large number of laboratory triaxial creep tests. A nonlinear creep model for unsaturated soils and its three-dimensional form were then deduced based on component model theory and unsaturated soil mechanics. This creep model was validated with the laboratory creep data. The results show that this creep model can effectively and accurately describe the nonlinear creep behavior of the unsaturated sliding zone soils. In order to apply this creep model to predict the long-term deformation process of landslides, a numerical model for simulating the coupled seepage and creep deformation of unsaturated sliding zone soils was developed based on this creep model through the finite element method (FEM). Using this numerical model, we simulated the deformation process of the Shuping landslide, located in the Three Gorges reservoir area, under the cyclic reservoir water level fluctuation during one year. The simulated creep displacements were then compared with the field deformation monitoring data, showing good agreement in trend. The results show that the creep deformation of landslides has strong connections with the changes of reservoir water level. The creep model of unsaturated sliding zone soils and the findings obtained by numerical simulations in this study are conducive to
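
    Component models of this kind assemble springs and dashpots; a common starting point is the Burgers (Maxwell plus Kelvin) creep curve, to which suction-dependent terms are added in models like the one described. A minimal sketch of the Burgers creep strain under constant stress (all parameter values hypothetical):

        # Burgers (Maxwell + Kelvin) creep curve under constant stress;
        # hypothetical parameters, illustrating the component-model idea.
        import numpy as np

        def burgers_strain(t, sigma, E_M=50e3, eta_M=5e8, E_K=20e3, eta_K=1e7):
            """Creep strain; sigma and moduli in kPa, viscosities in kPa*h, t in h."""
            instant = sigma / E_M                                   # elastic spring
            steady = sigma * t / eta_M                              # viscous dashpot
            delayed = sigma / E_K * (1 - np.exp(-E_K * t / eta_K))  # Kelvin element
            return instant + steady + delayed

        t = np.linspace(0.0, 2000.0, 5)           # hours
        print(burgers_strain(t, sigma=100.0))     # full-process creep curve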

  9. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    ERIC Educational Resources Information Center

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  10. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  11. Software Engineering Laboratory (SEL) relationships, models, and management rules

    NASA Technical Reports Server (NTRS)

    Decker, William; Hendrick, Robert; Valett, Jon D.

    1991-01-01

    Over 50 individual Software Engineering Laboratory (SEL) research results, extracted from a review of published SEL documentation, that can be applied directly to managing software development projects are captured. Four basic categories of results are defined and discussed - environment profiles, relationships, models, and management rules. In each category, research results are presented as a single page that summarizes the individual result, lists potential uses of the result by managers, and references the original SEL documentation where the result was found. The document serves as a concise reference summary of applicable research for SEL managers.

  12. Analytics For Distracted Driver Behavior Modeling in Dilemma Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jan-Mou; Malikopoulos, Andreas; Thakur, Gautam

    2014-01-01

    In this paper, we present the results obtained and insights gained through the analysis of the TRB contest data. We used exploratory analysis, regression, and clustering models to gain insights into driver behavior in a dilemma zone while driving under distraction. While simple exploratory analysis showed distinguishing driver behavior patterns among different population groups in the dilemma zone, regression analysis showed statistically significant relationships between groups of variables. In addition to analyzing the contest data, we also looked into the possible impact of distracted driving on fuel economy.

  13. Modeling Physical Systems Using Vensim PLE Systems Dynamics Software

    NASA Astrophysics Data System (ADS)

    Widmark, Stephen

    2012-02-01

    Many physical systems are described by time-dependent differential equations or systems of such equations. This makes it difficult for students in an introductory physics class to solve many real-world problems, since these students typically have little or no experience with this kind of mathematics. In my high school physics classes, I address this problem by having my students use a variety of software solutions to model physical systems described by differential equations. These include spreadsheets, applets, software my students themselves create, and systems dynamics software. For the latter, cost is often the main issue in choosing a solution for use in a public school, and so I researched no-cost software. I found Sphinx SD, OptiSim, Systems Dynamics, Simile (Trial Edition), and Vensim PLE. In evaluating each of these solutions, I looked for the fewest restrictions in the license for educational use, ease of use by students, power, and versatility. In my opinion, Vensim PLE best fulfills these criteria.
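
    Under the hood, all of these tools do the same thing: integrate stock-and-flow (differential equation) models with a simple stepping scheme. A minimal sketch of Euler integration for a typical intro-physics example, a falling object with quadratic air drag (parameter values illustrative):

        # Euler integration of a stock-and-flow model: falling object with
        # quadratic air drag. Illustrative parameters.
        m, g, c = 80.0, 9.8, 0.25      # mass [kg], gravity [m/s^2], drag [kg/m]
        v, t, dt = 0.0, 0.0, 0.01      # velocity "stock", time, step [s]
        while t < 20.0:
            a = g - (c / m) * v**2     # net acceleration (flow into the stock)
            v += a * dt                # Euler update
            t += dt
        print(f"v(20 s) = {v:.1f} m/s; terminal ~ {(m * g / c) ** 0.5:.1f} m/s")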

  14. Identifying Developmental Zones in Maize Lateral Root Cell Length Profiles using Multiple Change-Point Models

    PubMed Central

    Moreno-Ortega, Beatriz; Fort, Guillaume; Muller, Bertrand; Guédon, Yann

    2017-01-01

    The identification of the limits between the cell division, elongation and mature zones in the root apex is still a matter of controversy when methods based on cellular features, molecular markers or kinematics are compared, while methods based on cell length profiles have remained comparatively underexplored. Segmentation models were developed to identify developmental zones within a root apex on the basis of epidermal cell length profiles. Heteroscedastic piecewise linear models were estimated for maize lateral roots of various lengths of both the wild type and two mutants affected in auxin signaling (rtcs and rum-1). The outputs of these individual root analyses, combined with morphological features (first root hair position and root diameter), were then globally analyzed using principal component analysis. Three zones corresponding to the division zone, the elongation zone and the mature zone were identified in most lateral roots, while the division zone and sometimes the elongation zone were missing in arrested roots. Our results are consistent with an auxin-dependent coordination between cell flux, cell elongation and cell differentiation. The proposed segmentation models could extend our knowledge of developmental regulation in longitudinally organized plant organs such as roots, monocot leaves or internodes. PMID:29123533
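
    The essence of the segmentation approach can be shown with a single change point: choose the breakpoint that minimizes the summed within-segment residual error of straight-line fits to the cell length profile. A toy sketch with synthetic data follows; the published models are heteroscedastic, allow multiple change points, and are estimated more efficiently.

        # Toy single-change-point segmentation of a cell length profile by
        # exhaustive breakpoint search; synthetic data for illustration.
        import numpy as np

        def fit_cost(x, y):
            """Residual sum of squares of a straight-line fit."""
            coef = np.polyfit(x, y, 1)
            return np.sum((y - np.polyval(coef, x)) ** 2)

        def best_breakpoint(x, y, min_pts=3):
            costs = [(fit_cost(x[:k], y[:k]) + fit_cost(x[k:], y[k:]), k)
                     for k in range(min_pts, len(x) - min_pts)]
            return min(costs)                 # (total RSS, change-point index)

        pos = np.arange(40, dtype=float)      # cell rank from the root tip
        length = np.where(pos < 15, 10 + 0.2 * pos, -50 + 4.2 * pos)  # um
        length += np.random.default_rng(1).normal(0.0, 2.0, pos.size)
        print(best_breakpoint(pos, length))   # change point near index 15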

  15. A Compatible Hardware/Software Reliability Prediction Model.

    DTIC Science & Technology

    1981-07-22

    machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting...software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several...capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed

  16. Modeling CANDU-6 liquid zone controllers for effects of thorium-based fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St-Aubin, E.; Marleau, G.

    2012-07-01

    We use the DRAGON code to model the CANDU-6 liquid zone controllers and evaluate the effects of thorium-based fuels on their incremental cross sections and reactivity worth. We optimize both the numerical quadrature and the spatial discretization of the 2D cell models in order to provide accurate fuel properties for the 3D liquid zone controller supercell models. We propose a low-computer-cost, parameterized, pseudo-exact approach to modelling 3D cluster geometries that avoids tracking issues on small external surfaces. This methodology provides consistent incremental cross sections and reactivity worths when the thickness of the buffer region is reduced. When compared with an approximate annular geometry representation of the fuel and coolant region, we observe that the cluster description of fuel bundles in the supercell models does not considerably increase the precision of the results while substantially increasing the CPU time. In addition, this comparison shows that it is imperative to describe the liquid zone controller geometry finely, since it has a strong impact on the incremental cross sections. This paper also shows that liquid zone controller reactivity worth is greatly decreased in the presence of thorium-based fuels compared to the reference natural uranium fuel, since the fission and the fast-to-thermal scattering incremental cross sections are higher for the new fuels. (authors)

  17. A root zone modelling approach to estimating groundwater recharge from irrigated areas

    NASA Astrophysics Data System (ADS)

    Jiménez-Martínez, J.; Skaggs, T. H.; van Genuchten, M. Th.; Candela, L.

    2009-03-01

    In irrigated semi-arid and arid regions, accurate knowledge of groundwater recharge is important for the sustainable management of scarce water resources. The Campo de Cartagena area of southeast Spain is a semi-arid region where irrigation return flow accounts for a substantial portion of recharge. In this study we estimated irrigation return flow using a root zone modelling approach in which irrigation, evapotranspiration, and soil moisture dynamics for specific crops and irrigation regimes were simulated with the HYDRUS-1D software package. The model was calibrated using field data collected in an experimental plot. Good agreement was achieved between the HYDRUS-1D simulations and field measurements made under melon and lettuce crops. The simulations indicated that water use by the crops was below potential levels despite regular irrigation. The fraction of applied water (irrigation plus precipitation) going to recharge ranged from 22% for a summer melon crop to 68% for a fall lettuce crop. In total, we estimate that irrigation of annual fruits and vegetables produces 26 hm³ y⁻¹ of groundwater recharge to the top unconfined aquifer. This estimate does not include important irrigated perennial crops in the region, such as artichoke and citrus. Overall, the results suggest a greater amount of irrigation return flow in the Campo de Cartagena region than was previously estimated.
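
    The return-flow estimate rests on a seasonal root-zone water balance: recharge R = I + P - ET - dS, expressed as a fraction of the applied water I + P. A minimal sketch with hypothetical seasonal totals chosen to fall in the reported range:

        # Root-zone water balance behind an irrigation-return-flow estimate;
        # hypothetical seasonal totals [mm] for a summer melon crop.
        def recharge_fraction(irrigation, precipitation, et, storage_change):
            applied = irrigation + precipitation
            recharge = applied - et - storage_change
            return recharge, recharge / applied

        R, f = recharge_fraction(irrigation=450.0, precipitation=50.0,
                                 et=380.0, storage_change=10.0)
        print(f"recharge = {R:.0f} mm, i.e. {f:.0%} of applied water")  # ~22%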

  18. A Two-Zone Multigrid Model for SI Engine Combustion Simulation Using Detailed Chemistry

    DOE PAGES

    Ge, Hai-Wen; Juneja, Harmit; Shi, Yu; ...

    2010-01-01

    An efficient multigrid (MG) model was implemented for spark-ignited (SI) engine combustion modeling using detailed chemistry. The model is designed to be coupled with a level-set G-equation model for flame propagation (GAMUT combustion model) for highly efficient engine simulation. The model was explored for a gasoline direct-injection SI engine with knocking combustion. The numerical results using the MG model were compared with the results of the original GAMUT combustion model. A simpler one-zone MG model was found to be unable to reproduce the results of the original GAMUT model. However, a two-zone MG model, which treats the burned and unburned regions separately, was found to provide much better accuracy and efficiency than the one-zone MG model. Without loss in accuracy, an order of magnitude speedup was achieved in terms of CPU and wall times. To reproduce the results of the original GAMUT combustion model, either a low searching level or a procedure to exclude high-temperature computational cells from the grouping should be applied to the unburned region, which was found to be more sensitive to the combustion model details.
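
    The core idea behind the multigrid chemistry speedup is to group thermodynamically similar cells and solve detailed chemistry once per group rather than once per cell; the two-zone variant simply groups burned and unburned cells separately. A toy sketch of such grouping (bin widths hypothetical), consistent with the abstract's observation that the unburned region needs finer treatment:

        # Group cells by burned/unburned status and by temperature bin, so
        # detailed chemistry is solved once per group. The unburned zone
        # gets a finer bin width, reflecting its greater sensitivity.

        from collections import defaultdict

        def group_cells(cells, dT_unburned=10.0, dT_burned=50.0):
            """cells: list of dicts with 'T' (K) and 'burned' (bool)."""
            groups = defaultdict(list)
            for i, c in enumerate(cells):
                dT = dT_burned if c["burned"] else dT_unburned
                groups[(c["burned"], int(c["T"] // dT))].append(i)
            return groups

        cells = [{"T": 812.0, "burned": False}, {"T": 818.0, "burned": False},
                 {"T": 2105.0, "burned": True}, {"T": 2140.0, "burned": True}]
        for key, members in group_cells(cells).items():
            print(key, members)   # one chemistry solve per group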

  19. The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.

    ERIC Educational Resources Information Center

    Bontis, Nick; Chung, Honsan

    2000-01-01

    Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…

  20. Modeling the migration of fluids in subduction zones

    NASA Astrophysics Data System (ADS)

    Spiegelman, M.; Wilson, C. R.; van Keken, P. E.; Hacker, B. R.

    2010-12-01

    Fluids play a major role in the formation of arc volcanism and the generation of continental crust. Progressive dehydration reactions in the downgoing slab release fluids to the hot overlying mantle wedge, causing flux melting and the migration of melts to the volcanic front. While the qualitative concept is well established, the quantitative details of fluid release, and especially of fluid migration and the generation of hydrous melting in the wedge, are still poorly understood. Here we present new models of fluid migration through the mantle wedge for subduction zones that span the spectrum of arcs worldwide. We focus on the flow of water and use an existing set of high-resolution thermal and metamorphic models (van Keken et al., JGR, in review) to predict the regions of water release from the sediments, upper and lower crust, and uppermost mantle. We use this water flux as input for the fluid migration calculation, based on new finite element models built on advanced computational libraries (FEniCS/PETSc) for efficient and flexible solution of coupled multi-physics problems. The first generation of these models solves for the evolution of porosity and fluid-pressure/flux throughout the slab and wedge, given solid flow, viscosity and thermal fields from the existing thermal models. Fluid flow in the new models depends on both the permeability and the rheology of the slab-wedge system, as interaction with rheological variability can induce additional pressure gradients that affect the fluid flow pathways. We will explore the sensitivity of fluid flow paths for a range of subduction zones and fluid flow parameters, with emphasis on the variability of the location of the volcanic arc with respect to flow paths and the expected degrees of hydrous melting, which can be estimated given a variety of wet-melting parameterizations (e.g. Katz et al., 2003; Kelley et al., 2010). The current models include just dehydration reactions, but work continues on the next generation of models which
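
    The fluid flux such porosity/fluid-pressure models compute is Darcy flow through a compacting matrix, with permeability tied to porosity. A hedged one-dimensional sketch of that relation (illustrative values only, not the study's parameters):

        # Darcy flux in 1-D with z positive upward:
        #   q = -(k(phi)/mu) * (dP/dz + rho_f * g),  with  k = k0 * phi**n
        # A pressure gradient steeper than hydrostatic drives fluid upward.
        # All parameter values below are illustrative.

        def darcy_flux(phi, dP_dz, k0=1e-7, n=3, mu=10.0, rho_f=2800.0, g=9.81):
            k = k0 * phi**n                      # porosity-dependent permeability [m^2]
            return -(k / mu) * (dP_dz + rho_f * g)

        # hydrostatic gradient would be about -2.75e4 Pa/m for this fluid density
        print(darcy_flux(0.01, -4.0e4))          # positive value: upward flux [m/s]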

  1. Application of neural networks to software quality modeling of a very large telecommunications system.

    PubMed

    Khoshgoftaar, T M; Allen, E B; Hudepohl, J P; Aud, S J

    1997-01-01

    Society relies on telecommunications to such an extent that telecommunications software must have high reliability. Enhanced measurement for early risk assessment of latent defects (EMERALD) is a joint project of Nortel and Bell Canada for improving the reliability of telecommunications software products. This paper reports a case study of neural-network modeling techniques developed for the EMERALD system. The resulting neural network is currently in the prototype testing phase at Nortel. Neural-network models can be used to identify fault-prone modules for extra attention early in development, and thus reduce the risk of operational problems with those modules. We modeled a subset of modules representing over seven million lines of code from a very large telecommunications software system. The set consisted of those modules reused with changes from the previous release. The dependent variable was membership in the class of fault-prone modules. The independent variables were principal components of nine measures of software design attributes. We compared the neural-network model with a nonparametric discriminant model and found the neural-network model had better predictive accuracy.
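
    A present-day analog of the pipeline described above is easy to assemble: standardize the nine design measures, take principal components, and train a small neural network to flag fault-prone modules. The sketch below uses scikit-learn as a stand-in for the paper's network, and synthetic data rather than EMERALD data:

        # Hedged analog of the modeling pipeline: principal components of
        # nine design measures feed a small neural-network classifier.
        # Synthetic data; scikit-learn stands in for the original network.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 9))            # nine design measures per module
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 1.2).astype(int)

        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=5),                 # principal components as inputs
            MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
        )
        model.fit(X[:400], y[:400])
        print("holdout accuracy:", model.score(X[400:], y[400:]))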

  2. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part I. Model validation

    USDA-ARS?s Scientific Manuscript database

    Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...

  3. Avoidable Software Procurements

    DTIC Science & Technology

    2012-09-01

    software license, software usage, ELA, Software as a Service (SaaS), Software Asset...PaaS (Platform as a Service), SaaS (Software as a Service), SAM (Software Asset Management), SMS (System Management Server), SEWP (Solutions for Enterprise Wide...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  4. Abrupt Upper-Plate Tilting Upon Slab-Transition-Zone Collision

    NASA Astrophysics Data System (ADS)

    Crameri, F.; Lithgow-Bertelloni, C. R.

    2017-12-01

    During its sinking, the remnant of a surface plate crosses and interacts with multiple boundaries in Earth's interior. The most prominent dynamic interaction arises at the upper-mantle transition zone, where the sinking plate is strongly affected by the higher-viscosity lower mantle. Within our numerical model, we show, for the first time, that this collision of the sinking slab with the transition zone induces a sudden, dramatic downward tilt of the upper plate towards the subduction trench. The slab-transition zone collision sets parts of the higher-viscosity lower mantle in motion, which naturally induces an overall larger return flow cell that, at its onset, tilts the upper plate abruptly by around 0.05 degrees over around 10 million years. Such a significant and abrupt variation in surface topography should be clearly visible in temporal geologic records of large-scale surface elevation and might explain continent-wide tilting as observed in Australia since the Eocene or North America during the Phanerozoic. Unravelling this crucial mantle-lithosphere interaction was possible thanks to state-of-the-art numerical modelling (powered by StagYY; Tackley 2008, PEPI) and post-processing (powered by StagLab; www.fabiocrameri.ch/software). The new model introduced here to study the dynamically self-consistent temporal evolution of subduction features accurate subduction-zone topography, robust single-sided plate sinking, stronger plates close to laboratory values, an upper-mantle phase transition and, crucially, simple continents at a free surface. A novel, fully automated post-processing includes physical model diagnostics like slab geometry, mantle flow pattern, upper-plate tilt angle and trench location.

  5. Vegetation root zone storage and rooting depth, derived from local calibration of a global hydrological model

    NASA Astrophysics Data System (ADS)

    van der Ent, R.; Van Beek, R.; Sutanudjaja, E.; Wang-Erlandsson, L.; Hessels, T.; Bastiaanssen, W.; Bierkens, M. F.

    2017-12-01

    The storage and dynamics of water in the root zone control many important hydrological processes such as saturation excess overland flow, interflow, recharge, capillary rise, soil evaporation and transpiration. These processes are parameterized in hydrological models or land-surface schemes and the effect on runoff prediction can be large. Root zone parameters in global hydrological models are very uncertain as they cannot be measured directly at the scale on which these models operate. In this paper we calibrate the global hydrological model PCR-GLOBWB using a state-of-the-art ensemble of evaporation fields derived by solving the energy balance for satellite observations. We focus our calibration on the root zone parameters of PCR-GLOBWB and derive spatial patterns of maximum root zone storage. We find these patterns to correspond well with previous research. The parameterization of our model allows for the conversion of maximum root zone storage to root zone depth and we find that these correspond quite well to the point observations where available. We conclude that climate and soil type should be taken into account when regionalizing measured root depth for a certain vegetation type. We equally find that using evaporation rather than discharge better allows for local adjustment of root zone parameters within a basin and thus provides orthogonal data to diagnose and optimize hydrological models and land surface schemes.
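
    The storage-to-depth conversion mentioned above follows from treating the root zone as a reservoir whose plant-available water sits between wilting point and field capacity, so S_max = depth x (theta_fc - theta_wp). A minimal sketch with hypothetical soil values, which also makes clear why soil type matters when regionalizing rooting depth:

        # Convert maximum root zone storage (mm) to rooting depth (m) given
        # field capacity and wilting point. Soil values are hypothetical.

        def rooting_depth_m(s_max_mm, theta_fc, theta_wp):
            return s_max_mm / 1000.0 / (theta_fc - theta_wp)

        print(rooting_depth_m(150.0, 0.30, 0.15))   # loam: 1.0 m
        print(rooting_depth_m(150.0, 0.15, 0.05))   # sand: 1.5 m for the same storage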

  6. Vegetation root zone storage and rooting depth, derived from local calibration of a global hydrological model

    NASA Astrophysics Data System (ADS)

    van der Ent, Ruud; van Beek, Rens; Sutanudjaja, Edwin; Wang-Erlandsson, Lan; Hessels, Tim; Bastiaanssen, Wim; Bierkens, Marc

    2017-04-01

    The storage and dynamics of water in the root zone control many important hydrological processes such as saturation excess overland flow, interflow, recharge, capillary rise, soil evaporation and transpiration. These processes are parameterized in hydrological models or land-surface schemes and the effect on runoff prediction can be large. Root zone parameters in global hydrological models are very uncertain as they cannot be measured directly at the scale on which these models operate. In this paper we calibrate the global hydrological model PCR-GLOBWB using a state-of-the-art ensemble of evaporation fields derived by solving the energy balance for satellite observations. We focus our calibration on the root zone parameters of PCR-GLOBWB and derive spatial patterns of maximum root zone storage. We find these patterns to correspond well with previous research. The parameterization of our model allows for the conversion of maximum root zone storage to root zone depth and we find that these correspond quite well to the point observations where available. We conclude that climate and soil type should be taken into account when regionalizing measured root depth for a certain vegetation type. We equally find that using evaporation rather than discharge better allows for local adjustment of root zone parameters within a basin and thus provides orthogonal data to diagnose and optimize hydrological models and land surface schemes.

  7. Theoretical and software considerations for general dynamic analysis using multilevel substructured models

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1985-01-01

    The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.

  8. 1D minimum p-velocity model of the Kamchatka subducting zone

    NASA Astrophysics Data System (ADS)

    Nizkous, I.; Sanina, I.; Gontovaya, L.

    2003-04-01

    Kamchatka peninsula is a very active seismic zone. The old Pacific plate subducts below the North American Plate, causing high seismic and volcanic activity in this region. The extensive Kamchatka Regional Seismic Network (KRSN) has operated since 1962 and registers around 600 earthquakes per year, providing a large volume of high-quality seismic data. In this work we investigate the P-velocity structure of the Kamchatka peninsula and the subduction zone in the western Pacific. This region is well studied, but we take a somewhat different approach: we present a 1D minimum P-velocity model of the Kamchatka region created using the VELEST program [3]. The data set is based on 84 well-located earthquakes (IP, EP, IS and ES phases) recorded by KRSN in 1998 and 1999. Kuzin's model [1] was taken as the initial model, but in our calculations we split the model into 17 layers instead of the initial 5. The maximum depth investigated is 120 km. Using the VELEST simultaneous mode, we solve the coupled hypocenter-velocity problem for local earthquakes. In this case it is very important to use well-locatable events in order to minimize a priori uncertainties, and this is the major point of the approach. Applying this idea, we obtain a result similar to that obtained by A. Gorbatov et al. [2]. Using this 1D minimum model, we redetermine earthquake hypocenter parameters and recalculate P-wave travel-time residuals. This work is the first step in 3D modeling of the Kamchatka subduction zone. References: 1. I.P. Kuzin, 'Focal zone and upper mantle structure of the East Kamchatka region', Moscow, Nauka, 1974. 2. A. Gorbatov, J. Domingues, G. Suarez, V. Kostoglodov, D. Zhao, and E. Gordeev, 'Tomographic imaging of the P-wave velocity structure beneath the Kamchatka peninsula', Geophys. J. Int., 1999, 137, 269-279. 3. Kissling, E., W.L. Ellsworth, D. Eberhart-Phillips, and U. Kradolfer, 'Initial reference models in local earthquake tomography', J. Geophys. Res., 99
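
    The forward problem underneath such a 1D minimum model is travel time through a stack of constant-velocity layers; VELEST couples many of these forward calculations with hypocenter relocation. A sketch for the vertical-ray case, with hypothetical layer values rather than the Kamchatka model:

        # Vertical P-wave travel time from a source depth to the surface
        # through constant-velocity layers. Layer values are hypothetical.

        def travel_time(layers, source_depth_km):
            """layers: list of (thickness_km, vp_km_s) from the surface down."""
            t, top = 0.0, 0.0
            for h, vp in layers:
                seg = min(h, max(0.0, source_depth_km - top))  # path in this layer
                t += seg / vp
                top += h
            return t

        layers = [(5.0, 5.5), (15.0, 6.2), (20.0, 6.8), (80.0, 7.9)]  # to 120 km
        print(f"{travel_time(layers, 35.0):.2f} s")   # source at 35 km depth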

  9. The Cascadia Subduction Zone: two contrasting models of lithospheric structure

    USGS Publications Warehouse

    Romanyuk, T.V.; Blakely, R.; Mooney, W.D.

    1998-01-01

    The Pacific margin of North America is one of the most complicated regions in the world in terms of its structure and present day geodynamic regime. The aim of this work is to develop a better understanding of lithospheric structure of the Pacific Northwest, in particular the Cascadia subduction zone of Southwest Canada and Northwest USA. The goal is to compare and contrast the lithospheric density structure along two profiles across the subduction zone and to interpret the differences in terms of active processes. The subduction of the Juan de Fuca plate beneath North America changes markedly along the length of the subduction zone, notably in the angle of subduction, distribution of earthquakes and volcanism, geologic and seismic structure of the upper plate, and regional horizontal stress. To investigate these characteristics, we conducted detailed density modeling of the crust and mantle along two transects across the Cascadia subduction zone. One crosses Vancouver Island and the Canadian margin, the other crosses the margin of central Oregon.

  10. Entrepreneurial model based technology creative industries sector software through the use of free open source software for Universitas Pendidikan Indonesia students

    NASA Astrophysics Data System (ADS)

    Hasan, B.; Hasbullah; Purnama, W.; Hery, A.

    2016-04-01

    The development of a software-based creative industry using Free Open Source Software (FOSS) is expected to be one of the solutions for fostering new entrepreneurs among students, who can open job opportunities and contribute to economic development in Indonesia. This study aims to create an entrepreneurial coaching model for the creative industries based on FOSS, as well as to provide understanding and foster software-based creative industry entrepreneurship among students of Universitas Pendidikan Indonesia. The activity begins with identifying the software technology business to be developed, followed by training and mentoring, an apprenticeship at industrial partners, the creation of business plans, and monitoring and evaluation. The activity involves 30 UPI students who are motivated toward self-employment and competent in information technology. The expected outcome of these activities is the emergence of new student entrepreneurs in the software industry, in commerce (e-commerce), education (e-learning/LMS), and games.

  11. Advances in Games Technology: Software, Models, and Intelligence

    ERIC Educational Resources Information Center

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  12. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models that practitioners will accept, a clear discrimination of the existing models based on their specific properties is needed. The aim of this study is therefore to perform a systematic literature review investigating the properties of existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that nearly half (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community and will help quality assessment model developers formulate newer models, as well as practitioners (software evaluators) in selecting suitable OSS among alternatives.

  13. Agent-Based Modeling and Simulation in the Dilemma Zone

    DOT National Transportation Integrated Search

    2015-12-01

    The goal of this study is to develop a realistic dilemma zone (DZ) model that considers the effects of factors surrounding vehicles at an intersection, particularly focusing on driver decision-making behavior, such as the presence of a pedestrian cou...

  14. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.

  15. Alternative Zoning Scenarios for Regional Sustainable Land Use Controls in China: A Knowledge-Based Multiobjective Optimisation Model

    PubMed Central

    Xia, Yin; Liu, Dianfeng; Liu, Yaolin; He, Jianhua; Hong, Xiaofeng

    2014-01-01

    Alternative land use zoning scenarios provide guidance for sustainable land use controls. This study focused on an ecologically vulnerable catchment on the Loess Plateau in China, proposed a novel land use zoning model, and generated alternative zoning solutions to satisfy the various requirements of land use stakeholders and managers. This model combined multiple zoning objectives, i.e., maximum zoning suitability, maximum planning compatibility and maximum spatial compactness, with land use constraints using a goal programming technique, and employed a modified simulated annealing algorithm to search for optimal zoning solutions. The land use zoning knowledge was incorporated into the initialisation operator and neighbourhood selection strategy of the simulated annealing algorithm to improve its efficiency. The case study indicates that the model is both effective and robust. Five optimal zoning scenarios of the study area were helpful for satisfying the requirements of land use controls in loess hilly regions, e.g., land use intensification, agricultural protection and environmental conservation. PMID:25170679
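
    The optimization loop at the heart of such a zoning model can be sketched compactly: propose a zone change for one cell, score the weighted objectives, and accept worse solutions with a temperature-dependent probability. The toy below (with a 1-D compactness proxy and hypothetical weights and data) stands in for the paper's goal-programming formulation; the knowledge-based elements would enter through the initial solution and the choice of candidate moves:

        # Minimal simulated-annealing core for land use zoning (maximizing
        # a weighted sum of suitability and compactness). Toy stand-in for
        # the paper's formulation; all weights and data are hypothetical.

        import math, random

        def anneal(grid, suitability, zones, w=(1.0, 1.0),
                   t0=1.0, cooling=0.999, steps=5000):
            def score(g):
                suit = sum(suitability[i][g[i]] for i in range(len(g)))
                compact = sum(g[i] == g[i + 1] for i in range(len(g) - 1))
                return w[0] * suit + w[1] * compact

            cur, t = score(grid), t0
            for _ in range(steps):
                i = random.randrange(len(grid))
                prev = grid[i]
                grid[i] = random.choice(zones)
                new = score(grid)
                # Metropolis rule: keep improvements, sometimes keep worse moves
                if new >= cur or random.random() < math.exp((new - cur) / t):
                    cur = new
                else:
                    grid[i] = prev
                t *= cooling
            return grid, cur

        zones = ["agri", "eco"]
        suit = [{"agri": 0.9, "eco": 0.2}, {"agri": 0.8, "eco": 0.3},
                {"agri": 0.1, "eco": 0.9}, {"agri": 0.2, "eco": 0.8}]
        print(anneal(["agri"] * 4, suit, zones))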

  16. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    PubMed

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes during the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In practical software development, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might still exist, and new faults might be introduced in the process, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to describe fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data and five criteria. The results show that the model gives better fitting and predictive performance.
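
    The NHPP framework underlying such models can be made concrete with a small numerical sketch: the expected cumulative number of detected faults m(t) evolves as dm/dt = b(t)(a(t) - p*m(t)), where b(t) is a detection rate driven here by a testing-coverage curve, p is the fault removal efficiency, and a(t) grows with fault introduction. This is a generic stand-in, not the paper's exact formulation, and all parameters are hypothetical:

        # Euler integration of a generic NHPP software reliability growth
        # model with testing coverage, imperfect removal efficiency p, and
        # fault introduction rate alpha. All parameters are hypothetical.

        import math

        def simulate(a0=100.0, p=0.9, alpha=0.02, b_max=0.1, t_end=200.0, n=2000):
            dt, m, a = t_end / n, 0.0, a0
            for k in range(n):
                coverage = 1.0 - math.exp(-0.03 * k * dt)  # coverage growth curve
                dm = b_max * coverage * (a - p * m) * dt   # faults detected in step
                a += alpha * dm                            # imperfect debugging adds faults
                m += dm
            return m

        print(f"expected faults detected by t=200: {simulate():.1f}")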

  17. A testing-coverage software reliability model considering fault removal efficiency and error generation

    PubMed Central

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes during the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In practical software development, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might still exist, and new faults might be introduced in the process, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to describe fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data and five criteria. The results show that the model gives better fitting and predictive performance. PMID:28750091

  18. Active Learning through Modeling: Introduction to Software Development in the Business Curriculum

    ERIC Educational Resources Information Center

    Roussev, Boris; Rousseva, Yvonna

    2004-01-01

    Modern software practices call for the active involvement of business people in the software process. Therefore, programming has become an indispensable part of the information systems component of the core curriculum at business schools. In this paper, we present a model-based approach to teaching introduction to programming to general business…

  19. Software Smarts

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

  20. Stress concentration on Intraplate Seismicity: Numerical Modeling of Slab-released Fluids in the New Madrid Seismic Zone

    NASA Astrophysics Data System (ADS)

    Saxena, A.; Choi, E.; Powell, C. A.

    2017-12-01

    The mechanism behind the seismicity of the New Madrid Seismic Zone (NMSZ), the major intraplate earthquake source in the Central and Eastern US (CEUS), is still debated, but new insights are being provided by recent tomographic studies involving USArray. A high-resolution tomography study by Nyamwandha et al. (2016) in the NMSZ indicates the presence of low (3-5%) upper mantle Vp and Vs anomalies in the depth range 100 to 250 km. The elevated anomaly magnitudes are difficult to explain by temperature alone. Just as the low-velocity anomalies beneath northeast China are attributed to fluids released from the stagnant Pacific slab, water released from the stagnant Laramide slab, presently located at transition-zone depths beneath the CEUS, might be contributing to the low-velocity features in this region's upper mantle. Here, we investigate the potential impact of the slab-released fluids on the stresses at seismogenic depths using numerical modeling. We convert the tomographic results into a temperature field under various assumed values of spatially uniform water content. In more realistic cases, water content is added only where the converted temperature exceeds the melting temperature of olivine. Viscosities are then computed from the temperature and water content and given to our geodynamic models created with PyLith, an open-source code for crustal dynamics. The model results show that increasing water content weakens the upper mantle more than temperature alone and thus elevates the differential stress in the upper crust. These results can better explain the tomography results and seismicity without invoking melting. We also invert the tomography results for the volume fraction of orthopyroxene and temperature, and compare the resultant stresses with those for pure olivine. To enhance reproducibility, selected models in this study will be made available in the form of sharable and reproducible packages enabled by the EarthCube Building Blocks project, GeoTrust.
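
    The temperature- and water-dependent viscosity step described above is typically an Arrhenius-type flow law with power-law water weakening, eta ~ A * C_OH**(-r) * exp(E/(R*T)). A sketch with illustrative parameters, not the study's calibration:

        # Arrhenius-type viscosity with water weakening. Parameter values
        # are illustrative textbook-scale numbers, not the study's.

        import math

        R = 8.314  # gas constant, J/(mol K)

        def viscosity(T_kelvin, c_oh_ppm, A=1.0e9, r=1.0, E=375.0e3):
            """Effective viscosity (Pa s); nominally dry cells use c_oh_ppm ~ 1."""
            return A * c_oh_ppm**(-r) * math.exp(E / (R * T_kelvin))

        dry, wet = viscosity(1600.0, 1.0), viscosity(1600.0, 1000.0)
        print(f"dry: {dry:.1e} Pa s, wet/dry ratio: {wet/dry:.0e}")  # water weakens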

  1. European Regional Climate Zone Modeling of a Commercial Absorption Heat Pump Hot Water Heater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishaldeep; Shen, Bo; Keinath, Chris

    2017-01-01

    High-efficiency gas-burning hot water heating takes advantage of a condensing heat exchanger to deliver improved combustion efficiency over a standard non-condensing configuration; the delivered water heating energy nevertheless always remains below the heating value of the gas consumed. In contrast, Gas Absorption Heat Pump (GAHP) hot water heating combines the efficiency of gas burning with the performance increase of a heat pump to offer significant gas energy savings. An ammonia-water system also has the advantage of zero Ozone Depletion Potential and low Global Warming Potential. In comparison with air-source electric heat pumps, the absorption system can maintain higher coefficients of performance in colder climates. In this work, a GAHP commercial water heating system was compared to a condensing gas storage system for a range of locations and climate zones across Europe. The thermodynamic performance map of a single-effect ammonia-water absorption system was used in a building energy modeling software that could also incorporate the changing ambient air temperature and water mains temperature for a specific location, as well as a full-service restaurant water draw pattern.
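
    The headline comparison reduces to dividing the same annual hot-water load by each technology's gas-based coefficient of performance. A back-of-envelope sketch with illustrative numbers, not results from the study:

        # Annual gas use for the same hot-water load: condensing storage
        # heater (efficiency < 1) vs. gas absorption heat pump (gas-fired
        # COP > 1, varying with climate). All numbers are illustrative.

        LOAD_KWH = 25000.0                         # hypothetical restaurant load

        def annual_gas_kwh(load_kwh, gas_cop):
            return load_kwh / gas_cop

        base = annual_gas_kwh(LOAD_KWH, 0.92)      # condensing storage heater
        for name, cop in [("GAHP, mild climate", 1.45), ("GAHP, cold climate", 1.25)]:
            kwh = annual_gas_kwh(LOAD_KWH, cop)
            print(f"{name}: {kwh:.0f} kWh gas, {100 * (1 - kwh / base):.0f}% saving")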

  2. Empirical Models of Zones Protecting Against Coal Dust Explosion

    NASA Astrophysics Data System (ADS)

    Prostański, Dariusz

    2017-09-01

    The paper presents the application of research results to specify relations between the volume of dust deposition and changes in its airborne concentration, which were used to shape zones protecting against coal dust explosion. The research methodology is presented, including methods for measuring dust concentration and dust deposition. Measurements were taken in the Brzeszcze Mine within the framework of the MEZAP project, co-financed by The National Centre for Research and Development (NCBR) and performed by the Institute of Mining Technology KOMAG, the Central Mining Institute (GIG) and the Coal Company PLC. The project enabled research on measuring the volume of dust deposition and its airborne concentration in protective zones in a number of workings of the Brzeszcze Mine. The developed model may be a supportive tool, either as a system located directly in protective zones or as an operator tool warning of an increasing hazard of coal dust explosion.

  3. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  4. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  6. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data were collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
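
    A present-day analog of this classification modeling is straightforward to sketch: a decision tree over internal product measures separating high-rework from low-rework components. Below, scikit-learn's CART stands in for C4.5, and the data is synthetic rather than the GSS library data:

        # Decision-tree classification of rework cost from internal product
        # measures. CART (scikit-learn) stands in for C4.5; synthetic data.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)
        # toy measures per component: module calls, cyclomatic complexity, size
        X = rng.normal(loc=[10, 8, 300], scale=[4, 3, 120], size=(300, 3))
        y = ((X[:, 1] > 9) & (X[:, 2] > 320)).astype(int)  # synthetic "high rework"

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=["calls", "complexity", "sloc"]))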

  7. Prowess - A Software Model for the Ooty Wide Field Array

    NASA Astrophysics Data System (ADS)

    Marthi, Visweshwar Ram

    2017-03-01

    One of the scientific objectives of the Ooty Wide Field Array (OWFA) is to observe the redshifted H i emission from z ~ 3.35. Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A software model for OWFA has been developed with a view to understanding the instrument-induced systematics, by describing a complete software model for the instrument. This model has been implemented through a suite of programs, together called Prowess, which has been conceived with the dual role of an emulator as well as observatory data analysis software. The programming philosophy followed in building Prowess enables a general user to define their own set of functions and add new functionality. This paper describes a co-ordinate system suitable for OWFA in which the baselines are defined. The foregrounds are simulated from their angular power spectra, and the visibilities are then computed from the foregrounds. These visibilities are then used for further processing, such as calibration and power spectrum estimation. The package allows for rich visualization features in multiple output formats in an interactive fashion, giving the user an intuitive feel for the data. Prowess has been extensively used for numerical predictions of the foregrounds for the OWFA H i experiment.

  8. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    NASA Astrophysics Data System (ADS)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates; the core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with creating software in a sustainable fashion: short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, and on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how it addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  9. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may

  10. Safe Zone of Posterior Screw Insertion for Talar Neck Fractures on 3-Dimensional Reconstruction Model.

    PubMed

    Wu, Jian-Qun; Ma, Sheng-Hui; Liu, Song; Qin, Cheng-He; Jin, Dan; Yu, Bin

    2017-02-01

    To investigate the optimal posterior screw placement and the geometry of safe zones for screw insertion in the talar neck. Computed tomography data for 15 normal feet were imported into Mimics 10.01 software for 3-dimensional reconstruction; 4.0-mm-diameter screws were simulated from the lateral tubercle of the posterior process of the talus to the talar head. The range of screw path trajectories and screw lengths at nine locations that did not breach the cortex of the talus were evaluated. In addition, the farthest point (point a) and nearest point (point b) of the safe zone to the subtalar joint at each location, the anteversion angle (angle A), which is parallel to the sagittal plane, and the horizontal angle (angle B), which is perpendicular to the sagittal plane, were measured. The safe zone was mainly between the 30% location and the 60% location; the width of each safe zone was 13.6° ± 1.4°, and the maximum height of each safe zone was 7.8° ± 1.2°. The height of the safe zone was lowest at the 30% location (4.5°) and highest at the 50% location (7.3°). The mixed safe zone of all tali was between the 50% location and the 60% location. When a screw was inserted at point a, the safe entry distance (screw length) ranged from 48.8 to 49.5 mm, and when inserted at point b, the distance ranged from 48.2 to 48.9 mm. Inserting a 48.7 mm screw, 5.6° laterally and 7.4° superiorly, from the lateral tubercle of the posterior process of the talus towards the talar head is safest. The safe zone of posterior screw fixation has been defined for most tali; assuming the fractures are well reduced, this may strengthen stability, shorten operation time and reduce the incidence of surgical complications. © 2017 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.

  11. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: what new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods? Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects applicable to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the effort is applying and testing these approaches for software reliability measurement. Such parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  12. Integration of bio- and geoscience data with the ODM2 standards and software ecosystem for the CZOData and BiG CZ Data projects

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.

    2015-12-01

    We have developed a family of solutions to the challenges of integrating diverse data from biological and geological (BiG) disciplines for Critical Zone (CZ) science. These standards and software solutions have been developed around the new Observations Data Model version 2.0 (ODM2, http://ODM2.org), which was designed as a profile of the Open Geospatial Consortium's (OGC) Observations and Measurements (O&M) standard. The ODM2 standards and software ecosystem has at its core an information model that balances specificity with flexibility to powerfully and equally serve the needs of multiple dataset types, from multivariate sensor-generated time series to geochemical measurements of specimen hierarchies to multi-dimensional spectral data to biodiversity observations. ODM2 has been adopted as the information model guiding the next generation of cyberinfrastructure development for the Interdisciplinary Earth Data Alliance (http://www.iedadata.org/) and the CUAHSI Water Data Center (https://www.cuahsi.org/wdc). Here we present several components of the ODM2 standards and software ecosystem that were developed specifically to help CZ scientists and their data managers to share and manage data through the national Critical Zone Observatory data integration project (CZOData, http://criticalzone.org/national/data/) and the bio integration with geo for critical zone science data project (BiG CZ Data, http://bigcz.org/). These include the ODM2 Controlled Vocabulary system (http://vocabulary.odm2.org), the YAML Observation Data Archive & exchange (YODA) File Format (https://github.com/ODM2/YODA-File) and the BiG CZ Toolbox, which will combine easy-to-install ODM2 databases (https://github.com/ODM2/ODM2) with a variety of graphical software packages for data management such as ODMTools (https://github.com/ODM2/ODMToolsPython) and the ODM2 Streaming Data Loader (https://github.com/ODM2/ODM2StreamingDataLoader).

  13. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities that provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g. SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis), with some details given about how to apply them; (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  14. Model-Driven Development for PDS4 Software and Services

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.

    2018-04-01

    PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available for use, by both software and services, to configure, promote resiliency, and improve interoperability.

  15. Accuracy of open-source software segmentation and paper-based printed three-dimensional models.

    PubMed

    Szymor, Piotr; Kozakiewicz, Marcin; Olszewski, Raphael

    2016-02-01

    In this study, we aimed to verify the accuracy of models created with the help of open-source Slicer 3.6.3 software (Surgical Planning Lab, Harvard Medical School, Harvard University, Boston, MA, USA) and the Mcor Matrix 300 paper-based 3D printer. Our study focused on the accuracy of recreating the walls of the right orbit of a cadaveric skull. Cone beam computed tomography (CBCT) of the skull was performed (0.25-mm pixel size, 0.5-mm slice thickness). Acquired DICOM data were imported into Slicer 3.6.3 software, where segmentation was performed. A virtual model was created and saved as an .STL file and imported into Netfabb Studio professional 4.9.5 software. Three different virtual models were created by cutting the original file along three different planes (coronal, sagittal, and axial). All models were printed with a Selective Deposition Lamination Technology Matrix 300 3D printer using 80 gsm A4 paper. The models were printed so that their cutting plane was parallel to the paper sheets creating the model. Each model (coronal, sagittal, and axial) consisted of three separate parts (∼200 sheets of paper each) that were glued together to form a final model. The skull and created models were scanned with a three-dimensional (3D) optical scanner (Breuckmann smart SCAN) and were saved as .STL files. Comparisons of the orbital walls of the skull, the virtual model, and each of the three paper models were carried out with GOM Inspect 7.5SR1 software. Deviations measured between the models analysed were presented in the form of a colour-labelled map and covered with an evenly distributed network of points automatically generated by the software. An average of 804.43 ± 19.39 points for each measurement was created. Differences measured in each point were exported as a .csv file. The results were statistically analysed using Statistica 10, with statistical significance set at p < 0.05. The average number of points created on models for each measurement was 804

  16. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC has been under development for the last three years and is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  17. Evaluation of the finite element software ABAQUS for biomechanical modelling of biphasic tissues.

    PubMed

    Wu, J Z; Herzog, W; Epstein, M

    1998-02-01

    The biphasic cartilage model proposed by Mow et al. (1980) has proven successful in capturing the essential mechanical features of articular cartilage. In order to analyse the joint contact mechanics in real, anatomical joints, the cartilage model needs to be implemented into a suitable finite element code to approximate the irregular surface geometries of such joints. However, systematic and extensive evaluation of the capacity of commercial software for modelling the contact mechanics with biphasic cartilage layers has not been made. This research was aimed at evaluating the commercial finite element software ABAQUS for analysing biphasic soft tissues. The solutions obtained using ABAQUS were compared with those obtained using other finite element models and analytical solutions for three numerical tests: an unconfined indentation test, a test with the contact of a spherical cartilage surface with a rigid plate, and an axisymmetric joint contact test. It was concluded that the biphasic cartilage model can be implemented into the commercial finite element software ABAQUS to analyse practical joint contact problems with biphasic articular cartilage layers.

  18. BioSTAR, a New Biomass and Yield Modeling Software

    NASA Astrophysics Data System (ADS)

    Kappas, M.; Degener, J.; Bauboeck, R.

    2013-12-01

    BioSTAR (Biomass Simulation Tool for Agricultural Resources) is a new crop model developed at the University of Göttingen for the assessment of agricultural biomass potentials in Lower Saxony, Germany. Lower Saxony is a major agricultural producer in Germany and in the EU, and biogas facilities using agricultural crops, manure, or both have seen a strong boom in the last decade. The objective of developing BioSTAR was to model the potentials of these agricultural bioenergy crops. BioSTAR is kept simple enough to be usable even for non-scientific users, e.g. staff in planning offices or farmers. The software is written in Java and uses a Microsoft Access database connection to read its input data and write its output data; in this sense the software architecture is something entirely new as far as existing crop models are concerned. The database connection enables very fast editing of the various data sources needed to run a crop simulation and fosters the organization of these data. Due to this setup, the number of individual sites that can be processed with a few clicks is limited only by the maximum size of an Access database (2 GB), so datasets of 10^5 sites or more can be stored and processed. Data can easily be copied or imported from Excel. Capabilities of the crop model are: simulation of single or multiple year crop growth with total biomass production, evapotranspiration, soil water budget of a 16-layer soil profile, and nitrogen budget. The original growth engine of the model was carbon based (Azam-Ali et al., 1994), but a radiation use efficiency and two transpiration based growth engines were added at a later point. Before each simulation run, the user can choose between these four growth engines and four different ET0 methods, or use an ensemble of them. To date (07/2013), the model has been calibrated for several winter and spring cereals, canola, maize

  19. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  20. SENSITIVE PARAMETER EVALUATION FOR A VADOSE ZONE FATE AND TRANSPORT MODEL

    EPA Science Inventory

    This report presents information pertaining to quantitative evaluation of the potential impact of selected parameters on output of vadose zone transport and fate models used to describe the behavior of hazardous chemicals in soil. The Vadose 2one Interactive Processes (VIP) model...

  1. Cascadia Subduction Zone

    USGS Publications Warehouse

    Frankel, Arthur D.; Petersen, Mark D.

    2008-01-01

    The geometry and recurrence times of large earthquakes associated with the Cascadia Subduction Zone (CSZ) were discussed and debated at a March 28-29, 2006 Pacific Northwest workshop for the USGS National Seismic Hazard Maps. The CSZ is modeled from Cape Mendocino in California to Vancouver Island in British Columbia. We include the same geometry and weighting scheme as was used in the 2002 model (Frankel and others, 2002) based on thermal constraints (Fig. 1; Fluck and others, 1997 and a reexamination by Wang et al., 2003, Fig. 11, eastern edge of intermediate shading). This scheme includes four possibilities for the lower (eastern) limit of seismic rupture: the base of elastic zone (weight 0.1), the base of transition zone (weight 0.2), the midpoint of the transition zone (weight 0.2), and a model with a long north-south segment at 123.8° W in the southern and central portions of the CSZ, with a dogleg to the northwest in the northern portion of the zone (weight 0.5). The latter model was derived from the approximate average longitude of the contour of the 30 km depth of the CSZ as modeled by Fluck et al. (1997). A global study of the maximum depth of thrust earthquakes on subduction zones by Tichelaar and Ruff (1993) indicated maximum depths of about 40 km for most of the subduction zones studied, although the Mexican subduction zone had a maximum depth of about 25 km (R. LaForge, pers. comm., 2006). The recent inversion of GPS data by McCaffrey et al. (2007) shows a significant amount of coupling (a coupling factor of 0.2-0.3) as far east as 123.8° West in some portions of the CSZ. Both of these lines of evidence lend support to the model with a north-south segment at 123.8° W.

  2. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Hawkins, Lamar

    1996-01-01

    This draft final report describes the work performed under delivery order number 145 from May 1995 through August 1996. The scope of work included a number of software development tasks for the performance modeling of AXAF-I. A number of new capabilities and functions have been added to the GT software, which is the command-mode version of the GRAZTRACE software originally developed by MSFC. A structural data interface has been developed for the EAL (formerly SPAR) finite element analysis (FEA) program, which is being used by the MSFC Structural Analysis group for the analysis of AXAF-I. This interface utility can read the structural deformation file from EAL and from other finite element analysis programs such as NASTRAN and COSMOS/M, and convert the data to a format suitable for deformation ray-tracing to predict the image quality for a distorted mirror. There is a provision in this utility to expand the data from finite element models assuming 180-degree symmetry. This utility has been used to predict image characteristics for the AXAF-I HRMA when subjected to gravity effects in the horizontal x-ray ground test configuration. The development of the metrology data processing interface software has also been completed. It can read the HDOS FITS-format surface map files, manipulate and filter the metrology data, and produce a deformation file that can be used by GT for ray tracing of the mirror surface figure errors. This utility has been used to determine the optimum alignment (axial spacing and clocking) for the four pairs of AXAF-I mirrors. Based on this optimized alignment, the geometric images and effective focal lengths for the as-built mirrors were predicted to cross-check the results obtained by Kodak.

  3. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  4. Model-It: A Case Study of Learner-Centered Software Design for Supporting Model Building.

    ERIC Educational Resources Information Center

    Jackson, Shari L.; Stratford, Steven J.; Krajcik, Joseph S.; Soloway, Elliot

    Learner-centered software design (LCSD) guides the design of tasks, tools, and interfaces in order to support the unique needs of learners: growth, diversity and motivation. This paper presents a framework for LCSD and describes a case study of its application to the ScienceWare Model-It, a learner-centered tool to support scientific modeling and…

  5. An updated model of induced airflow in the unsaturated zone

    USGS Publications Warehouse

    Baehr, Arthur L.; Joss, Craig J.

    1995-01-01

    Simulation of induced movement of air in the unsaturated zone provides a method to determine permeability and to design vapor extraction remediation systems. A previously published solution to the airflow equation for the case in which the unsaturated zone is separated from the atmosphere by a layer of lower permeability (such as a clay layer) has been superseded. The new solution simulates airflow through the layer of lower permeability more rigorously by defining the leakage in terms of the upper boundary condition rather than by adding a leakage term to the governing airflow equation. This note presents the derivation of the new solution. Formulas for steady state pressure, specific discharge, and mass flow in the domain are obtained for the new model and for the case in which the unsaturated zone is in direct contact with the atmosphere.
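
    The note derives closed-form results; as background, a standard pressure-squared linearization for steady isothermal air flow (assumed here as the usual starting point, not quoted from the paper) reduces mass conservation of an ideal gas flowing by Darcy's law, for homogeneous permeability $k$, to a Laplace equation:

      \nabla\cdot\left(\frac{k\,\rho}{\mu}\,\nabla p\right)=0,\qquad \rho=\frac{pM}{RT} \quad\Longrightarrow\quad \nabla^{2}\left(p^{2}\right)=0,

    and the leaky upper boundary described above can then be written as a flux condition across the low-permeability layer (thickness $b'$, permeability $k'$) at the top of the domain $z=L$:

      k\,\frac{\partial p^{2}}{\partial z}\bigg|_{z=L}=\frac{k'}{b'}\left[p_{\mathrm{atm}}^{2}-p^{2}\right]_{z=L}.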

  6. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model

    PubMed Central

    Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.

    2014-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410

  7. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model.

    PubMed

    Saul, Katherine R; Hu, Xiao; Goehler, Craig M; Vidt, Meghan E; Daly, Melissa; Velisar, Anca; Murray, Wendy M

    2015-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms.

  8. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different from the rest of the statistical field: the methods used for analysis are those available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach; DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood

  9. COMPARISON OF PBPK MODELING SOFTWARE FEATURES AND APPROACHES TO MODELING IMPORTANT PHYSIOLOGICAL AND BIOCHEMICAL BEHAVIORS

    EPA Science Inventory

    Abstract for 40th Annual Meeting of the Society of Toxicology, March 25-29, 2001

    COMPARISON OF PBPK MODELING SOFTWARE FEATURES AND APPROACHES TO MODELING IMPORTANT PHYSIOLOGICAL AND BIOCHEMICAL BEHAVIORS. R S DeWoskin and R W Setzer. USEPA/ORD/NHEERL, RTP, NC, USA.

    ...

  10. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.

  11. Exploring stop-go decision zones at rural high-speed intersections with flashing green signal and insufficient yellow time in China.

    PubMed

    Tang, Keshuang; Xu, Yanqing; Wang, Fen; Oguchi, Takashi

    2016-10-01

    The objective of this study is to empirically analyze and model the stop-go decision behavior of drivers at rural high-speed intersections in China, where a flashing green signal of 3 s followed by a yellow signal of 3 s is commonly applied to end a green phase. 1,186 high-resolution vehicle trajectories were collected at four typical high-speed intersection approaches in Shanghai and used for the identification of actual stop-go decision zones and the modeling of stop-go decision behavior. Results indicate that the presence of flashing green significantly changed the theoretical decision zones based on conventional dilemma zone theory. The actual stop-go decision zones at the study intersections were thus formulated and identified based on the empirical data. A binary logistic model and a fuzzy logic model were then developed to further explore the impacts of flashing green on the stop-go behavior of drivers. It was found that the fuzzy logic model could produce estimation results comparable to those of the traditional binary logistic models. The findings of this study could contribute to the development of effective dilemma zone protection strategies, the improvement of stop-go decision models embedded in microscopic traffic simulation software, and the proper design of signal change and clearance intervals at high-speed intersections in China.
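
    A binary logistic stop-go model of the kind described can be sketched in a few lines of Python; the trajectory file and the feature columns (approach speed, distance to the stop line, time into flashing green) are illustrative assumptions, not the paper's actual variables:

      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Hypothetical table: one row per vehicle at the onset of flashing green.
      df = pd.read_csv("trajectories.csv")
      X = df[["speed_mps", "distance_to_stopline_m", "time_into_flashing_s"]]
      y = df["stopped"]  # 1 = driver stopped, 0 = driver went through

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      model = LogisticRegression().fit(X_train, y_train)
      print("held-out accuracy:", model.score(X_test, y_test))
      # Coefficient signs show how each factor shifts the stop probability.
      print(dict(zip(X.columns, model.coef_[0])))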

  12. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary to obtain models small enough for verification; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  13. Lenstronomy: Multi-purpose gravitational lens modeling software package

    NASA Astrophysics Data System (ADS)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.

  14. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services using structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses to find factors and causalities; the latter serves to verify factors introduced by theory, building the model without heuristics. Applying the combined approaches to questionnaire responses from skilled project managers, this paper finds that the vendor property has high causality for success compared to software properties and project properties.

  15. Approach for delineation of contributing areas and zones of transport to selected public-supply wells using a regional ground-water flow model, Palm Beach County, Florida

    USGS Publications Warehouse

    Renken, R.A.; Patterson, R.D.; Orzol, L.L.; Dixon, Joann

    2001-01-01

    Rapid urban development and population growth in Palm Beach County, Florida, have been accompanied by the need for additional freshwater withdrawals from the surficial aquifer system. To maintain water quality, County officials protect capture areas and determine zones of transport of municipal supply wells. A multistep process was used to help automate the delineation of wellhead protection areas. A modular ground-water flow model (MODFLOW) Telescopic Mesh Refinement program (MODTMR) was used to construct an embedded flow model and combined with particle tracking to delineate zones of transport to supply wells; model output was coupled with a geographic information system. An embedded flow MODFLOW model was constructed using input and output file data from a preexisting three-dimensional, calibrated model of the surficial aquifer system. Three graphical user interfaces for use with the geographic information software, ArcView, were developed to enhance the telescopic mesh refinement process. These interfaces include AvMODTMR for use with MODTMR; AvHDRD to build MODFLOW river and drain input files from dynamically segmented linear (canals) data sets; and AvWELL Refiner, an interface designed to examine and convert well coverage spatial data layers to a MODFLOW Well package input file. MODPATH (the U.S. Geological Survey particle-tracking postprocessing program) and MODTOOLS (the set of U.S. Geological Survey computer programs to translate MODFLOW and MODPATH output to a geographic information system) were used to map zones of transport. A steady-state, five-layer model of the Boca Raton area was created using the telescopic mesh refinement process and calibrated to average conditions during January 1989 to June 1990. A sensitivity analysis of various model parameters indicates that the model is most sensitive to changes in recharge rates, hydraulic conductivity for layer 1, and leakance for layers 3 and 4 (Biscayne aquifer). Recharge (58 percent); river (canal

  16. Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)

    ERIC Educational Resources Information Center

    Yavuz, Guler; Hambleton, Ronald K.

    2017-01-01

    Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…

  17. Utility of coupling nonlinear optimization methods with numerical modeling software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, M.J.

    1996-08-05

    Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and nonlinear optimization software modules, GLOBAL and LOCAL. GLO is designed for controlling, and easy coupling to, any scientific software application. GLO runs the optimization module and scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model (Taylor cylinder impact test) is presented.
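
    The iterative loop described above can be sketched generically in Python (this is not GLO itself; the solver binary, template input file, and reference data are hypothetical stand-ins for the application being coupled):

      import subprocess
      import numpy as np
      from scipy.optimize import minimize

      TEMPLATE = open("input.template").read()   # input file with {} placeholders
      TARGET = np.loadtxt("reference.dat")       # desired result to match

      def objective(params):
          # "GLO-PUT": write the trial parameter values into the input file.
          with open("run.inp", "w") as f:
              f.write(TEMPLATE.format(*params))
          # Run the scientific application on that input.
          subprocess.run(["./solver", "run.inp"], check=True)
          # "GLO-GET": extract the results and compare them to the desired result.
          result = np.loadtxt("run.out")
          return float(np.sum((result - TARGET) ** 2))

      # The optimizer plays the role of the GLOBAL/LOCAL modules.
      best = minimize(objective, x0=[1.0, 0.5], method="Nelder-Mead")
      print(best.x, best.fun)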

  18. Validating modelled variable surface saturation in the riparian zone with thermal infrared images

    NASA Astrophysics Data System (ADS)

    Glaser, Barbara; Klaus, Julian; Frei, Sven; Frentress, Jay; Pfister, Laurent; Hopp, Luisa

    2015-04-01

    Variable contributing areas and hydrological connectivity have become prominent new concepts for hydrologic process understanding in recent years. The dynamic connectivity within the hillslope-riparian-stream (HRS) system is known to have a first order control on discharge generation and especially the riparian zone functions as runoff buffering or producing zone. However, despite their importance, the highly dynamic processes of contraction and extension of saturation within the riparian zone and its impact on runoff generation still remain not fully understood. In this study, we analysed the potential of a distributed, fully coupled and physically based model (HydroGeoSphere) to represent the spatial and temporal water flux dynamics of a forested headwater HRS system (6 ha) in western Luxembourg. The model was set up and parameterised under consideration of experimentally-derived knowledge of catchment structure and was run for a period of four years (October 2010 to August 2014). For model evaluation, we especially focused on the temporally varying spatial patterns of surface saturation. We used ground-based thermal infrared (TIR) imagery to map surface saturation with a high spatial and temporal resolution and collected 20 panoramic snapshots of the riparian zone (ca. 10 by 20 m) under different hydrologic conditions. These TIR panoramas were used in addition to several classical discharge and soil moisture time series for a spatially-distributed model validation. In a manual calibration process we optimised model parameters (e.g. porosity, saturated hydraulic conductivity, evaporation depth) to achieve a better agreement between observed and modelled discharges and soil moistures. The subsequent validation of surface saturation patterns by a visual comparison of processed TIR panoramas and corresponding model output panoramas revealed an overall good accordance for all but one region that was always too dry in the model. However, quantitative comparisons of

  19. Nonequilibrium thermodynamics of the shear-transformation-zone model

    NASA Astrophysics Data System (ADS)

    Luo, Alan M.; Öttinger, Hans Christian

    2014-02-01

    The shear-transformation-zone (STZ) model has been applied numerous times to describe the plastic deformation of different types of amorphous systems. We formulate this model within the general equation for nonequilibrium reversible-irreversible coupling (GENERIC) framework, thereby clarifying the thermodynamic structure of the constitutive equations and guaranteeing thermodynamic consistency. We propose natural, physically motivated forms for the building blocks of the GENERIC, which combine to produce a closed set of time evolution equations for the state variables, valid for any choice of free energy. We demonstrate an application of the new GENERIC-based model by choosing a simple form of the free energy. In addition, we present some numerical results and contrast those with the original STZ equations.
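
    In standard GENERIC notation (the framework itself, not this paper's specific building blocks), a state vector $x$ evolves as

      \frac{\mathrm{d}x}{\mathrm{d}t}=L(x)\,\frac{\delta E}{\delta x}+M(x)\,\frac{\delta S}{\delta x},\qquad L\,\frac{\delta S}{\delta x}=0,\qquad M\,\frac{\delta E}{\delta x}=0,

    where $E$ and $S$ are the total energy and entropy functionals, $L$ is the antisymmetric Poisson operator, and $M$ is the positive semi-definite friction operator; the two degeneracy conditions are what guarantee energy conservation and non-negative entropy production for any choice of free energy.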

  20. Fault zone hydrogeology

    NASA Astrophysics Data System (ADS)

    Bense, V. F.; Gleeson, T.; Loveless, S. E.; Bour, O.; Scibek, J.

    2013-12-01

    Deformation along faults in the shallow crust (< 1 km) introduces permeability heterogeneity and anisotropy, which has an important impact on processes such as regional groundwater flow, hydrocarbon migration, and hydrothermal fluid circulation. Fault zones have the capacity to be hydraulic conduits connecting shallow and deep geological environments, but simultaneously the fault cores of many faults often form effective barriers to flow. The direct evaluation of the impact of faults to fluid flow patterns remains a challenge and requires a multidisciplinary research effort of structural geologists and hydrogeologists. However, we find that these disciplines often use different methods with little interaction between them. In this review, we document the current multi-disciplinary understanding of fault zone hydrogeology. We discuss surface- and subsurface observations from diverse rock types from unlithified and lithified clastic sediments through to carbonate, crystalline, and volcanic rocks. For each rock type, we evaluate geological deformation mechanisms, hydrogeologic observations and conceptual models of fault zone hydrogeology. Outcrop observations indicate that fault zones commonly have a permeability structure suggesting they should act as complex conduit-barrier systems in which along-fault flow is encouraged and across-fault flow is impeded. Hydrogeological observations of fault zones reported in the literature show a broad qualitative agreement with outcrop-based conceptual models of fault zone hydrogeology. Nevertheless, the specific impact of a particular fault permeability structure on fault zone hydrogeology can only be assessed when the hydrogeological context of the fault zone is considered and not from outcrop observations alone. To gain a more integrated, comprehensive understanding of fault zone hydrogeology, we foresee numerous synergistic opportunities and challenges for the discipline of structural geology and hydrogeology to co-evolve and

  1. A Decision Model for Supporting Task Allocation Processes in Global Software Development

    NASA Astrophysics Data System (ADS)

    Lamersdorf, Ansgar; Münch, Jürgen; Rombach, Dieter

    Today, software-intensive systems are increasingly being developed in a globally distributed way. However, besides its benefits, global development also bears a set of risks and problems. One critical factor for successful project management of distributed software development is the allocation of tasks to sites, as this is assumed to have a major influence on the benefits and risks. We introduce a model that aims at improving management processes in globally distributed projects by giving decision support for task allocation that systematically regards multiple criteria. The criteria and causal relationships were identified in a literature study and refined in a qualitative interview study. The model uses existing approaches from distributed systems and statistical modeling. The article gives an overview of the problem and related work, introduces the empirical and theoretical foundations of the model, and shows the use of the model in an example scenario.

  2. Statistic inversion of multi-zone transition probability models for aquifer characterization in alluvial fans

    DOE PAGES

    Zhu, Lin; Dai, Zhenxue; Gong, Huili; ...

    2015-06-12

    Understanding the heterogeneity arising from the complex architecture of sedimentary sequences in alluvial fans is challenging. This study develops a statistical inverse framework in a multi-zone transition probability approach for characterizing the heterogeneity in alluvial fans. An analytical solution of the transition probability matrix is used to define the statistical relationships among different hydrofacies and their mean lengths, integral scales, and volumetric proportions. A statistical inversion is conducted to identify the multi-zone transition probability models and estimate the optimal statistical parameters using the modified Gauss–Newton–Levenberg–Marquardt method. The Jacobian matrix is computed by the sensitivity equation method, which results in an accurate inverse solution with quantification of parameter uncertainty. We use the Chaobai River alluvial fan in the Beijing Plain, China, as an example for elucidating the methodology of alluvial fan characterization. The alluvial fan is divided into three sediment zones. In each zone, the explicit mathematical formulations of the transition probability models are constructed with different optimized integral scales and volumetric proportions. The hydrofacies distributions in the three zones are simulated sequentially by the multi-zone transition probability-based indicator simulations. Finally, the result of this study provides the heterogeneous structure of the alluvial fan for further study of flow and transport simulations.
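
    In the standard (single-zone) transition-probability notation of Carle and Fogg, assumed here as the basis that the multi-zone framework extends, the matrix of transition probabilities over a lag $h_{\phi}$ in direction $\phi$ is a matrix exponential of a transition-rate matrix $R_{\phi}$:

      T(h_{\phi})=\exp\left(R_{\phi}\,h_{\phi}\right),\qquad t_{jk}(h_{\phi})=\Pr\{\text{facies }k\text{ at }x+h_{\phi}\mid\text{facies }j\text{ at }x\},

    with diagonal rates tied to the mean lengths of each hydrofacies, $r_{jj,\phi}=-1/\bar{L}_{j,\phi}$, and $t_{jk}(h_{\phi})\to p_{k}$, the volumetric proportion of facies $k$, as $h_{\phi}\to\infty$.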

  3. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    PubMed

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs, and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  4. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models

    PubMed Central

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.

    2014-01-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs, and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542

  5. Models Extracted from Text for System-Software Safety Analyses

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2010-01-01

    This presentation describes extraction and integration of requirements information and safety information in visualizations to support early review of completeness, correctness, and consistency of lengthy and diverse system safety analyses. Software tools have been developed and extended to perform the following tasks: 1) extract model parts and safety information from text in interface requirements documents, failure modes and effects analyses and hazard reports; 2) map and integrate the information to develop system architecture models and visualizations for safety analysts; and 3) provide model output to support virtual system integration testing. This presentation illustrates the methods and products with a rocket motor initiation case.

  6. Modeling the Effects of Hydrogeomorphology and Climatic Factors on Nitrogen, Phosphorus, and Greenhouse Gas Dynamics in Riparian Zones.

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Y.; Vidon, P.; Gold, A.; Pradhanang, S. M.; Addy, K.

    2017-12-01

    Vegetated riparian zones are often considered for use as best management practices to mitigate the impacts of agriculture on water quality. However, riparian zones can also be a source of greenhouse gases, and their influence on water quality varies depending on landscape hydrogeomorphic characteristics and climate. Methods used to evaluate riparian zone functions include conceptual models and spatially explicit, process-based models (e.g., REMM), but very few attempts have been made to connect riparian zone characteristics with function using easily accessible landscape-scale data. Here, we present comprehensive statistical models that can be used to assess riparian zone functions from easily obtainable landscape-scale hydrogeomorphic attributes and climate data. Models were developed from a database spanning 88 years and 36 sites. Statistical methods including principal component analysis and stepwise regression were used to reduce data dimensionality and identify significant predictors. Models were validated using additional data collected from the scientific literature. The 8 models developed connect landscape characteristics to nitrogen and phosphorus concentration and removal (1-4), greenhouse gas emissions (5-7), and water table depth (8). Results show the range of influence that various climate and landscape characteristics have on riparian zone functions, and the tradeoffs that exist with regard to nitrogen, phosphorus, and greenhouse gases. These models will help reduce the need for extensive field measurements and help scientists and land managers make more informed decisions regarding the use of riparian zones for water quality management.
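
    A minimal Python sketch of the statistical pipeline (dimensionality reduction followed by regression), where the site table riparian_sites.csv and its columns are hypothetical and ordinary least squares stands in for the paper's stepwise selection:

      import pandas as pd
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      # Hypothetical table: numeric landscape/climate attributes per site,
      # plus an observed nitrogen-removal response.
      df = pd.read_csv("riparian_sites.csv")
      X, y = df.drop(columns="n_removal_pct"), df["n_removal_pct"]

      # Keep the principal components explaining 95% of the variance.
      pcs = PCA(n_components=0.95).fit_transform(X)
      model = LinearRegression().fit(pcs, y)
      print("R^2 on training data:", model.score(pcs, y))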

  7. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  8. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

    The groundwater VULnerability to PESticide software system (VULPES) is a user-friendly, GIS-based, client-server software system developed to identify areas vulnerable to pesticides at the regional level making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers in investigating areas sensitive to specific substances and proposing limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 metre depth; VULPES then creates a vulnerability map in shapefile format which classifies the outputs by comparing them against a lower threshold set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail, together with a case study applying the terbuthylazine herbicide to the Lombardy region territory. Three zones with different degrees of vulnerability have been identified and described.
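
    The final classification step can be sketched as follows in Python; the input file is hypothetical and the three-class banding above the 0.1 μg/l legal limit is illustrative (the paper fixes only the lower threshold):

      import pandas as pd

      LEGAL_LIMIT = 0.1  # ug/l, legal limit concentration in groundwater

      # Hypothetical PELMO output: simulated leaching concentrations per UGU.
      df = pd.read_csv("pelmo_output.csv")  # columns: ugu_id, conc_ug_l
      p80 = df.groupby("ugu_id")["conc_ug_l"].quantile(0.8)

      # Band the 80th-percentile concentrations against the legal limit.
      bands = pd.cut(p80,
                     bins=[0.0, LEGAL_LIMIT, 10 * LEGAL_LIMIT, float("inf")],
                     labels=["not vulnerable", "vulnerable", "highly vulnerable"])
      print(bands.value_counts())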

  9. Software for browsing sectioned images of a dog body and generating a 3D model.

    PubMed

    Park, Jin Seo; Jung, Yong Wook

    2016-01-01

    The goals of this study were (1) to provide accessible and instructive browsing software for sectioned images and a portable document format (PDF) file that includes three-dimensional (3D) models of an entire dog body and (2) to develop techniques for segmentation and 3D modeling that would enable an investigator to perform these tasks without the aid of a computer engineer. To achieve these goals, relatively important or large structures in the sectioned images were outlined to generate segmented images. The sectioned and segmented images were then packaged into browsing software. In this software, structures in the sectioned images are shown in detail and in real color. After 3D models were made from the segmented images, the 3D models were exported into a PDF file, in which they can be manipulated freely. The browsing software and PDF file are available for study by students, for lectures by teachers, and for training of clinicians. These files will be helpful for the anatomical study and clinical training of veterinary students and clinicians. Furthermore, these techniques will be useful for researchers who study two-dimensional images and 3D models.

  10. Critical Thinking Skills of Students through Mathematics Learning with ASSURE Model Assisted by Software Autograph

    NASA Astrophysics Data System (ADS)

    Kristianti, Y.; Prabawanto, S.; Suhendra, S.

    2017-09-01

    This study aims to examine the critical thinking ability of students who learn mathematics with the ASSURE learning model assisted by Autograph software. The design of this study was experimental, with a pre-test/post-test control group. The experimental group received mathematics instruction with the Autograph-assisted ASSURE model, and the control group received mathematics instruction with the conventional model. Data were obtained through critical thinking skills tests. The research was conducted at the junior high school level; the population consisted of students of a junior high school in Subang Regency in the 2016/2017 school year, and the sample consisted of two class VIII groups from one junior high school in Subang Regency. The data were analysed quantitatively: the normalized gain levels of the two sample groups were compared using a one-way ANOVA test. The results show that mathematics learning with the Autograph-assisted ASSURE model can improve the critical thinking ability of junior high school students, and does so significantly better than the conventional model.

  11. Mojave Compliant Zone Structure and Properties: Constraints from InSAR and Mechanical Models

    NASA Astrophysics Data System (ADS)

    Hearn, E. H.; Fialko, Y.; Finzi, Y.

    2007-12-01

    Long-lived zones with significantly lower elastic strength than their surroundings are associated with active Mojave faults (e.g., Li et al., 1999; Fialko et al., 2002, 2004). In an earthquake these weak features concentrate strain, causing them to show up as anomalous, short length-scale features in SAR interferograms (Fialko et al., 2002). Fault-zone trapped wave studies indicate that the 1999 Hector Mine earthquake caused a small reduction in P- and S-wave velocities in a compliant zone along the Landers earthquake rupture (Vidale and Li, 2003). This suggests that coseismic strain concentration, and the resulting damage, in the compliant zone caused a further reduction in its elastic strength. Even a small coseismic strength drop should make a compliant zone (CZ) deform, in response to the total (not just the coseismic) stress. The strain should be in the sense which is compatible with the orientations and values of the region's principal stresses. However, as indicated by Fialko and co-workers (2002, 2004), the sense of coseismic strain of Mojave compliant zones was consistent with coseismic stress change, not the regional (background) stress. Here we use finite-element models to investigate how InSAR measurements of Mojave compliant zone coseismic strain place limits on their dimensions and on upper crustal stresses. We find that unless the CZ is shallow, narrow, and has a high Poisson's ratio (e.g., 0.4), CZ contraction under lithostatic stress overshadows deformation due to deviatoric background stress or coseismic stress change. We present ranges of CZ dimensions which are compatible with the observed surface deformation and address how these dimensions compare with new results from damage-controlled fault evolution models.

  12. Program SPACECAP: software for estimating animal density using spatially explicit capture-recapture models

    USGS Publications Warehouse

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas

    2012-01-01

    1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.

  13. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Effective and Efficient Stochastic Global Optimization

    DTIC Science & Technology

    2012-02-01

    parameter estimation method, but rather to carefully describe how to use the ERDC software implementation of MLSL that accommodates the PEST model...model-independent LM-method-based parameter estimation software PEST (Doherty, 2004, 2007a, 2007b), which quantifies model-to-measurement misfit...et al. (2011) focused on one drawback associated with LM-based model-independent parameter estimation as implemented in PEST; viz., that it requires

  14. A new predictive multi-zone model for HCCI engine combustion

    DOE PAGES

    Bissoli, Mattia; Frassoldati, Alessio; Cuoci, Alberto; ...

    2016-06-30

    Here, this work introduces a new predictive multi-zone model for the description of combustion in Homogeneous Charge Compression Ignition (HCCI) engines. The model exploits the existing OpenSMOKE++ computational suite to handle detailed kinetic mechanisms, providing reliable predictions of the in-cylinder auto-ignition processes. All the elements with a significant impact on combustion performance and emissions, like turbulence, heat and mass exchanges, crevices, residual burned gases, and thermal and feed stratification, are taken into account. Compared to other computational approaches, this model improves the description of mixture stratification phenomena by coupling a wall heat transfer model derived from CFD applications with a proper turbulence model. Furthermore, the calibration of this multi-zone model requires only three parameters, which can be derived from a non-reactive CFD simulation: these adaptive variables depend only on the engine geometry and remain fixed across a wide range of operating conditions, allowing the prediction of auto-ignition, pressure traces and pollutants. This computational framework enables the use of detailed kinetic mechanisms, as well as Rate of Production Analysis (RoPA) and Sensitivity Analysis (SA), to investigate the complex chemistry involved in the auto-ignition and pollutant formation processes. In the final sections of the paper, these capabilities are demonstrated through comparison with experimental data.

  15. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
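
    The Basic and Log-Poisson models named above are presumably Musa's basic execution-time model and the Musa-Okumoto logarithmic Poisson model (an assumption; the report does not spell this out), whose standard mean-value functions are

      \mu_{\mathrm{basic}}(t)=\nu_{0}\left(1-e^{-\lambda_{0}t/\nu_{0}}\right),\qquad \mu_{\mathrm{LP}}(t)=\frac{1}{\theta}\,\ln\left(1+\lambda_{0}\theta t\right),

    where $\lambda_{0}$ is the initial failure intensity, $\nu_{0}$ the expected total number of failures, and $\theta$ the failure-intensity decay parameter; the replication question raised above is whether repeated debuggings of the same code scatter too widely around such curves for a single run to validate either model.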

  16. Quantifying Hydro-biogeochemical Model Sensitivity in Assessment of Climate Change Effect on Hyporheic Zone Processes

    NASA Astrophysics Data System (ADS)

    Song, X.; Chen, X.; Dai, H.; Hammond, G. E.; Song, H. S.; Stegen, J.

    2016-12-01

    The hyporheic zone is an active region for biogeochemical processes such as carbon and nitrogen cycling, where groundwater and surface water mix and interact with each other with distinct biogeochemical and thermal properties. The biogeochemical dynamics within the hyporheic zone are driven by both river water and groundwater hydraulic dynamics, which are directly affected by climate change scenarios. Beyond that, the hydraulic and thermal properties of local sediments and microbial and chemical processes also play important roles in biogeochemical dynamics. Thus, for a comprehensive understanding of the biogeochemical processes in the hyporheic zone, a coupled thermo-hydro-biogeochemical model is needed. As multiple uncertainty sources are involved in the integrated model, it is important to identify its key modules/parameters through sensitivity analysis. In this study, we develop a 2D cross-section model of the hyporheic zone at the DOE Hanford site adjacent to the Columbia River and use this model to quantify module and parametric sensitivity in the assessment of climate change. To achieve this purpose, we 1) develop a facies-based groundwater flow and heat transfer model that incorporates facies geometry and heterogeneity characterized from a field data set, 2) derive multiple reaction networks/pathways from batch experiments with in-situ samples and integrate temperature-dependent reactive transport modules into the flow model, 3) assign multiple climate change scenarios to the coupled model by analyzing historical river stage data, and 4) apply a variance-based global sensitivity analysis to quantify scenario/module/parameter uncertainty at each level of the hierarchy. The objectives of the research are to 1) identify the key controlling factors of the coupled thermo-hydro-biogeochemical model in the assessment of climate change, and 2) quantify the carbon consumption under different climate change scenarios in the hyporheic zone.
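
    Variance-based global sensitivity analysis of the kind named above is usually summarized with Sobol indices (standard definitions, assumed here rather than taken from the abstract): for model output $Y$ and input factor $X_{i}$,

      S_{i}=\frac{\operatorname{Var}\left(\mathbb{E}\left[Y\mid X_{i}\right]\right)}{\operatorname{Var}(Y)},\qquad S_{T_{i}}=1-\frac{\operatorname{Var}\left(\mathbb{E}\left[Y\mid X_{\sim i}\right]\right)}{\operatorname{Var}(Y)},

    where $S_{i}$ measures the first-order contribution of $X_{i}$ alone and the total-effect index $S_{T_{i}}$ adds all interactions involving $X_{i}$; evaluated hierarchically, $Y$ may be taken at the scenario, module, or parameter level.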

  17. Software Engineering Education Directory

    DTIC Science & Technology

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  18. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations; heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer, consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  19. Transfer zones and fault reactivation in inverted rift basins: Insights from physical modelling

    NASA Astrophysics Data System (ADS)

    Konstantinovskaya, Elena A.; Harris, Lyal B.; Poulin, Jimmy; Ivanov, Gennady M.

    2007-08-01

    Lateral transfer zones of deformation and fault reactivation were investigated in multilayered silicone-sand models during extension and subsequent co-axial shortening. Model materials were selected to meet similarity criteria and to be distinguished on CT scans; this approach permitted non-destructive visualisation of the progressive evolution of structures. Transfer zones were initiated by an orthogonal offset in the geometry of a basal mobile aluminium sheet and/or by variations of layer thickness or material rheology in basal layers. Transfer zones affected rift propagation and fault kinematics in models. Propagation and overlapping rift culminations occurred in transfer zones during extension. During shortening, deviation in the orientation of frontal thrusts and fold axes occurred within transfer zones in brittle and ductile layers, respectively. CT scans showed that steep (58-67°) rift-margin normal faults were reactivated as reverse faults. The reactivated faults rotated to shallower dips (19-38°) with continuing shortening after 100% inversion. Rotation of rift phase faults appears to be due to deep level folding and uplift during the inversion phase. New thrust faults with shallow dips (20-34°) formed outside the inverted graben at late stages of shortening. Frontal ramps propagated laterally past the transfer structure during shortening. During inversion, the layers filling the rift structures underwent lateral compression at the depth, the graben fill was pushed up and outwards creating local extension near the surface. Sand marker layers in inverted graben have showed fold-like structures or rotation and tilting in the rifts and on the rift margins. The results of our experiments conform well to natural examples of inverted graben. Inverted rift basins are structurally complex and often difficult to interpret in seismic data. The models may help to unravel the structure and evolution of these systems, leading to improved hydrocarbon exploration

  20. Co-Modeling and Co-Synthesis of Safety-Critical Multi-threaded Embedded Software for Multi-Core Embedded Platforms

    DTIC Science & Technology

    2017-03-20

    Keywords: computation, Prime Implicates, Boolean Abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model... Related publication: "...types for time-dependent data-flow networks," J.-P. Talpin, P. Jouvelot, S. Shukla, ACM-IEEE Conference on Methods and Models for System Design.

  1. Modelface: an Application Programming Interface (API) for Homology Modeling Studies Using Modeller Software

    PubMed Central

    Sakhteman, Amirhossein; Zare, Bijan

    2016-01-01

    An interactive application, Modelface, is presented for the Modeller software on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building, and loop refinement. Other modules of Modeller, including energy calculation, energy minimization, and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with a small memory footprint, and it is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
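
    As context for the kind of Modeller workflow a front end like Modelface wraps, the sketch below shows a minimal comparative-modeling run using Modeller's documented Python API. It is a hedged illustration rather than Modelface itself; the alignment file and structure codes (target.ali, template, target) are hypothetical placeholders.

    ```python
    # Minimal Modeller comparative-modeling run -- a sketch of the steps a
    # front end like Modelface automates; all file names are hypothetical.
    from modeller import environ
    from modeller.automodel import automodel

    env = environ()                      # set up the Modeller environment
    env.io.atom_files_directory = ['.']  # directory holding template PDBs

    a = automodel(env,
                  alnfile='target.ali',  # target-template alignment (PIR)
                  knowns='template',     # code of the template structure
                  sequence='target')     # code of the target sequence
    a.starting_model = 1
    a.ending_model = 5                   # build five candidate models
    a.make()                             # runs modeling, writes model PDBs
    ```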

  2. COMPILATION OF SATURATED AND UNSATURATED ZONE MODELING SOFTWARE (EPA/600/SR-96/009)

    EPA Science Inventory

    The study reflects the ongoing groundwater modeling information collection and processing activities at the International Ground Water Modeling Center (IGWMC). The full report briefly discusses the information acquisition and processing procedures, the MARS information database, ...

  3. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses plates, an extension to these models by Spiegelhalter and Gilks used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
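
    The exact Bayes factors mentioned above are tractable whenever conjugate priors yield closed-form marginal likelihoods. As a self-contained illustration (mine, not the paper's), the snippet below compares two Beta-Binomial models of a coin-flip sample:

    ```python
    # Exact Bayes factor for two Beta-Binomial models of k heads in n flips.
    # Marginal likelihood: p(k | n) = C(n, k) * B(k + a, n - k + b) / B(a, b).
    from math import comb, lgamma, log, exp

    def log_beta(a, b):
        return lgamma(a) + lgamma(b) - lgamma(a + b)

    def log_marginal(k, n, a, b):
        # log evidence under a Beta(a, b) prior on the success probability
        return log(comb(n, k)) + log_beta(k + a, n - k + b) - log_beta(a, b)

    k, n = 7, 10
    # Model 1: vague Beta(1, 1) prior; Model 2: prior concentrated near 0.5.
    bf = exp(log_marginal(k, n, 1, 1) - log_marginal(k, n, 50, 50))
    print(f"Bayes factor, vague vs. near-fair prior: {bf:.2f}")
    ```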

  4. Predicting Biological Information Flow in a Model Oxygen Minimum Zone

    NASA Astrophysics Data System (ADS)

    Louca, S.; Hawley, A. K.; Katsev, S.; Beltran, M. T.; Bhatia, M. P.; Michiels, C.; Capelle, D.; Lavik, G.; Doebeli, M.; Crowe, S.; Hallam, S. J.

    2016-02-01

    Microbial activity drives marine biochemical fluxes and nutrient cycling at global scales. Geochemical measurements as well as molecular techniques such as metagenomics, metatranscriptomics and metaproteomics provide great insight into microbial activity. However, an integration of molecular and geochemical data into mechanistic biogeochemical models is still lacking. Recent work suggests that microbial metabolic pathways are, at the ecosystem level, strongly shaped by stoichiometric and energetic constraints. Hence, models rooted in fluxes of matter and energy may yield a holistic understanding of biogeochemistry. Furthermore, such pathway-centric models would allow a direct consolidation with meta'omic data. Here we present a pathway-centric biogeochemical model for the seasonal oxygen minimum zone in Saanich Inlet, a fjord off the coast of Vancouver Island. The model considers key dissimilatory nitrogen and sulfur fluxes, as well as the population dynamics of the genes that mediate them. By assuming a direct translation of biocatalyzed energy fluxes to biosynthesis rates, we make predictions about the distribution and activity of the corresponding genes. A comparison of the model to molecular measurements indicates that the model explains observed DNA, RNA, protein and cell depth profiles. This suggests that microbial activity in marine ecosystems such as oxygen minimum zones is well described by DNA abundance, which, in conjunction with geochemical constraints, determines pathway expression and process rates. Our work further demonstrates how meta'omic data can be mechanistically linked to environmental redox conditions and biogeochemical processes.
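
    To make the flux-to-biosynthesis assumption concrete, the toy rate model below (my sketch, not the authors' code, with invented parameter values) lets the abundance B of a pathway's gene pool grow in proportion to the energy flux it catalyzes, minus a first-order loss:

    ```python
    # Toy pathway-centric dynamics: dB/dt = eps * F(t) - delta * B, where F
    # is the biocatalyzed energy flux and eps converts flux to biosynthesis.
    import numpy as np
    from scipy.integrate import odeint

    eps, delta = 0.1, 0.05       # yield and loss rate (1/day), invented values

    def flux(t):                 # seasonal energy flux, arbitrary units
        return 1.0 + 0.5 * np.sin(2.0 * np.pi * t / 365.0)

    def dBdt(B, t):
        return eps * flux(t) - delta * B

    t = np.linspace(0.0, 730.0, 1461)   # two years at half-day resolution
    B = odeint(dBdt, 0.5, t)
    print(f"gene abundance after two years: {B[-1, 0]:.3f}")
    ```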

  5. Quantifying Uncertainty in Inverse Models of Geologic Data from Shear Zones

    NASA Astrophysics Data System (ADS)

    Davis, J. R.; Titus, S.

    2016-12-01

    We use Bayesian Markov chain Monte Carlo simulation to quantify uncertainty in inverse models of geologic data. Although this approach can be applied to many tectonic settings, field areas, and mathematical models, we focus on transpressional shear zones. The underlying forward model, either kinematic or dynamic, produces a velocity field, which predicts the dikes, foliation-lineations, crystallographic preferred orientation (CPO), shape preferred orientation (SPO), and other geologic data that should arise in the shear zone. These predictions are compared to data using modern methods of geometric statistics, including the Watson (for lines such as dike poles), isotropic matrix Fisher (for orientations such as foliation-lineations and CPO), and multivariate normal (for log-ellipsoids such as SPO) distributions. The result of the comparison is a likelihood, which is a key ingredient in the Bayesian approach. The other key ingredient is a prior distribution, which reflects the geologist's knowledge of the parameters before seeing the data. For some parameters, such as shear zone strike and dip, we identify realistic informative priors. For other parameters, where the geologist has no prior knowledge, we identify useful uninformative priors. We investigate the performance of this approach through numerical experiments on synthetic data sets. A fundamental issue is that many models of deformation exhibit asymptotic behavior (e.g., flow apophyses, fabric attractors) or periodic behavior (e.g., SPO when the clasts are rigid), which causes the likelihood to be too uniform. Based on our experiments, we offer rules of thumb for how many data, of which types, are needed to constrain deformation.
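
    As a schematic of the Bayesian machinery described above (not the authors' code), the following random-walk Metropolis sampler draws from the posterior of a single parameter given a likelihood and prior; in the paper the parameter vector, forward model, and directional-statistics likelihoods are far richer.

    ```python
    # Minimal random-walk Metropolis sampler for a 1-D posterior.
    # Stand-in densities: the real application uses a forward shear-zone
    # model and directional-statistics likelihoods (Watson, matrix Fisher).
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(1.2, 0.3, size=50)    # synthetic observations

    def log_prior(theta):                   # informative prior, e.g. strike
        return -0.5 * ((theta - 1.0) / 0.5) ** 2

    def log_likelihood(theta):
        return -0.5 * np.sum(((data - theta) / 0.3) ** 2)

    def log_post(theta):
        return log_prior(theta) + log_likelihood(theta)

    theta, chain = 0.0, []
    for _ in range(20000):
        prop = theta + 0.1 * rng.normal()   # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                    # accept; otherwise keep theta
        chain.append(theta)

    burned = np.array(chain[5000:])         # discard burn-in
    print(f"posterior mean {burned.mean():.3f} +/- {burned.std():.3f}")
    ```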

  6. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application to requirements error analysis of a software reliability model having a well-defined correspondence to computer program properties is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.

  7. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  8. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
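
    The graph-plus-discrete-event approach can be sketched generically. The toy below is my illustration, not the OSTI package: it schedules photon and classical-message arrivals on a single link with different (invented) latencies, exposing the synchronization problem the software quantifies.

    ```python
    # Toy discrete-event simulation of paired quantum/classical transmissions.
    # Illustrative only: the real software models full networks and protocols.
    import heapq

    events = []                # priority queue of (time, description)
    CLASSICAL_LATENCY = 5e-6   # seconds; hypothetical link parameters
    QUANTUM_LATENCY = 7e-6

    for i in range(3):         # three teleportation rounds, one every 10 us
        t0 = i * 1e-5
        heapq.heappush(events, (t0 + QUANTUM_LATENCY,
                                f"round {i}: qubit arrives"))
        heapq.heappush(events, (t0 + CLASSICAL_LATENCY,
                                f"round {i}: measurement bits arrive"))

    while events:              # replay events in time order
        t, what = heapq.heappop(events)
        print(f"t = {t * 1e6:7.2f} us  {what}")
    # A receiver must buffer whichever signal arrives first until its
    # partner arrives -- the synchronization the simulator measures.
    ```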

  9. Zoning method for environmental engineering geological patterns in underground coal mining areas.

    PubMed

    Liu, Shiliang; Li, Wenping; Wang, Qiqing

    2018-09-01

    Environmental engineering geological patterns (EEGPs) express the trend and intensity of eco-geological environmental change caused by mining in underground coal mining areas, a complex process controlled by multiple factors. A new zoning method for EEGPs was developed based on variable-weight theory (VWT), in which the weights of factors vary with their values. The method was applied to the Yushenfu mining area, Shaanxi, China. First, the mechanism of the EEGPs caused by mining was elucidated, and four types of EEGPs were proposed. Subsequently, 13 key control factors were selected from mining conditions, lithosphere, hydrosphere, ecosphere, and climatic conditions; their thematic maps were constructed using ArcGIS software and remote-sensing technologies. Then, a stimulation-punishment variable-weight model, derived from the partition of the basic evaluation units of the study area, the construction of the partition state-variable-weight vector, and the determination of the variable-weight intervals, was built to calculate the variable weights of each factor. On this basis, a zoning mathematical model of EEGPs was established, and the zoning results were analyzed. For comparison, the traditional constant-weight theory (CWT) was also applied to divide the EEGPs. Finally, the zoning results obtained using VWT and CWT were compared. Verification by field investigation indicates that VWT is more accurate and reliable than CWT. The zoning results are consistent with actual conditions and are key to planning the rational development of coal resources and the protection of the eco-geological environment. Copyright © 2018 Elsevier B.V. All rights reserved.
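
    The core of the variable-weight idea fits in a few lines. In the sketch below (my illustration; the paper's state-variable-weight vectors are more elaborate), a factor's constant weight is amplified when its value enters a "punishment" range, and the weights are then renormalized:

    ```python
    # Variable-weight evaluation sketch: weights respond to factor state.
    # w_i(x) = w0_i * S_i(x) / sum_j w0_j * S_j(x), with S a state function
    # that penalizes extreme factor values (functional form assumed here).
    import numpy as np

    w0 = np.array([0.4, 0.35, 0.25])   # constant (CWT) weights, hypothetical
    x = np.array([0.9, 0.2, 0.5])      # normalized factor values in [0, 1]

    def state(x, threshold=0.7, boost=2.0):
        # Punish-type state function: amplify factors past the threshold.
        return np.where(x > threshold, boost, 1.0)

    w = w0 * state(x)
    w /= w.sum()                        # renormalize to variable weights
    score_cwt = float(w0 @ x)           # constant-weight index
    score_vwt = float(w @ x)            # variable-weight index
    print(f"CWT score {score_cwt:.3f}  VWT score {score_vwt:.3f}")
    ```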

  10. A novel breast software phantom for biomechanical modeling of elastography.

    PubMed

    Bhatti, Syeda Naema; Sridhar-Keralapura, Mallika

    2012-04-01

    In developing breast imaging technologies, testing is done with phantoms. Physical phantoms are normally used, but their size, shape, composition, and detail cannot be modified readily. These difficulties can be avoided by creating a software breast phantom. Researchers have created software breast phantoms using geometric and/or mathematical methods for applications like image fusion. The authors report a 3D software breast phantom, built using a mechanical design tool, to investigate the biomechanics of elastography using finite element modeling (FEM). The authors propose this phantom as an intermediate assessment tool for elastography simulation, for use after testing with commonly used phantoms and before clinical testing. The authors design the phantom to be flexible in both the breast geometry and the biomechanical parameters, to make it a useful tool for elastography simulation. The authors develop the 3D software phantom using a mechanical design tool, based on illustrations of normal breast anatomy; the software phantom does not use geometric primitives or imaging data. The authors discuss how to create this phantom and how to modify it. The authors demonstrate a typical elastography experiment of applying a static stress to the top surface of the breast just above a simulated tumor and calculate normal strains in 3D, and in 2D with plane-strain approximations, using linear solvers. In particular, they investigate contrast transfer efficiency (CTE) with a parametric study based on location, shape, and stiffness of simulated tumors. The authors also compare their findings to a commonly used elastography phantom. The 3D breast software phantom is flexible in the shape, size, and location of tumors, the glandular-to-fatty content, and the ductal structure. Residual modulus maps and profiles served as a guide to optimize meshing of this geometrically nonlinear phantom for biomechanical modeling of elastography. At best, low residues (around 1-5 kPa) were

  11. Time-lapse gravity data for monitoring and modeling artificial recharge through a thick unsaturated zone

    NASA Astrophysics Data System (ADS)

    Kennedy, Jeffrey; Ferré, Ty P. A.; Creutzfeldt, Benjamin

    2016-09-01

    Groundwater-level measurements in monitoring wells or piezometers are the most common, and often the only, hydrologic measurements made at artificial recharge facilities. Measurements of gravity change over time provide an additional source of information about changes in groundwater storage and infiltration, and for model calibration. We demonstrate that for an artificial recharge facility with a deep groundwater table, gravity data are more sensitive to movement of water through the unsaturated zone than are groundwater levels. Groundwater levels have a delayed response to infiltration, change in a similar manner at many potential monitoring locations, and are heavily influenced by high-frequency noise induced by pumping; in contrast, gravity changes start immediately at the onset of infiltration and are sensitive to water in the unsaturated zone. Continuous gravity data can determine infiltration rate, and the estimate is only minimally affected by uncertainty in water-content change. Gravity data are also useful for constraining parameters in a coupled groundwater/unsaturated-zone model (MODFLOW-NWT with the Unsaturated Zone Flow (UZF) package).
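
    The sensitivity of gravity to stored water can be estimated from the infinite (Bouguer) slab approximation, which gives roughly 0.42 microGal per centimeter of water. The short check below is my illustration, not from the paper:

    ```python
    # Bouguer slab estimate of the gravity change from a change in storage:
    # dg = 2 * pi * G * rho_w * dh (valid for a laterally extensive layer).
    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    RHO_W = 1000.0     # density of water, kg/m^3

    def gravity_change_microgal(dh_m):
        dg = 2.0 * math.pi * G * RHO_W * dh_m   # in m/s^2
        return dg * 1e8                          # 1 m/s^2 = 1e8 microGal

    # 0.5 m of infiltrated water -> roughly 21 microGal, well above the
    # few-microGal repeatability of modern relative gravimeters.
    print(f"{gravity_change_microgal(0.5):.1f} microGal")
    ```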

  12. Time-lapse gravity data for monitoring and modeling artificial recharge through a thick unsaturated zone

    USGS Publications Warehouse

    Kennedy, Jeffrey R.; Ferre, Ty P.A.; Creutzfeldt, Benjamin

    2016-01-01

    Groundwater-level measurements in monitoring wells or piezometers are the most common, and often the only, hydrologic measurements made at artificial recharge facilities. Measurements of gravity change over time provide an additional source of information about changes in groundwater storage and infiltration, and for model calibration. We demonstrate that for an artificial recharge facility with a deep groundwater table, gravity data are more sensitive to movement of water through the unsaturated zone than are groundwater levels. Groundwater levels have a delayed response to infiltration, change in a similar manner at many potential monitoring locations, and are heavily influenced by high-frequency noise induced by pumping; in contrast, gravity changes start immediately at the onset of infiltration and are sensitive to water in the unsaturated zone. Continuous gravity data can determine infiltration rate, and the estimate is only minimally affected by uncertainty in water-content change. Gravity data are also useful for constraining parameters in a coupled groundwater/unsaturated-zone model (MODFLOW-NWT with the Unsaturated Zone Flow (UZF) package).

  13. Hydro-geophysical observations integration in numerical model: case study in Mediterranean karstic unsaturated zone (Larzac, france)

    NASA Astrophysics Data System (ADS)

    Champollion, Cédric; Fores, Benjamin; Le Moigne, Nicolas; Chéry, Jean

    2016-04-01

    Karstic hydro-systems are highly non-linear and heterogeneous, yet they are one of the main water resources in the Mediterranean area. Neither local measurements in boreholes nor analysis at the spring can take into account the variability of the water storage. For several years, ground-based geophysical measurements (such as gravity, electrical resistivity or seismological data) have allowed water storage to be followed in heterogeneous hydrosystems at an intermediate scale between boreholes and basin. Beyond classical rigorous monitoring, the integration of geophysical data into hydrological numerical models is needed for both process interpretation and quantification. A karstic geophysical observatory (GEK: Géodésie de l'Environnement Karstique, OSU OREME, SNO H+) has been set up in the Mediterranean area in the south of France. The observatory is installed on more than 250 m of karstified dolomite, with an unsaturated zone of ~150 m thickness. At the observatory, water level in boreholes, evapotranspiration and rainfall are classical hydro-meteorological observations, completed by continuous gravity, resistivity and seismological measurements. The main objective of the study is the modelling of the whole observation dataset with an explicit one-dimensional unsaturated numerical model. The Hydrus software is used for explicit modelling of water storage and transfer, linking the different observations (geophysics, water level, evapotranspiration) to the water saturation. Unknown hydrological parameters (permeability, porosity) are retrieved from stochastic inversions. The scales of investigation of the different observations are discussed in light of the modelling results. A sensitivity study of the measurements against the model is presented, together with the key hydro-geological processes of the site.
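
    Hydrus-type models parameterize the unsaturated zone with van Genuchten retention curves, which can be written down directly. The parameter values below are generic illustrations, not the observatory's calibrated values:

    ```python
    # Van Genuchten (1980) water retention curve used by Hydrus-type models:
    # theta(h) = theta_r + (theta_s - theta_r) / (1 + (alpha*|h|)^n)^m,
    # with m = 1 - 1/n.
    import numpy as np

    def van_genuchten(h, theta_r=0.05, theta_s=0.40, alpha=0.5, n=1.8):
        # h: pressure head in meters (negative in the unsaturated zone).
        m = 1.0 - 1.0 / n
        se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)  # effective saturation
        return theta_r + (theta_s - theta_r) * se

    for h in (-0.1, -1.0, -10.0, -100.0):
        print(f"h = {h:7.1f} m  theta = {van_genuchten(h):.3f}")
    ```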

  14. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  15. An Amphibious Magnetotelluric Investigation of the Cascadian Seismogenic and ETS zones.

    NASA Astrophysics Data System (ADS)

    Parris, B. A.; Livelybrooks, D.; Bedrosian, P.; Egbert, G. D.; Key, K.; Schultz, A.; Cook, A.; Kant, M.; Wogan, N.; Zeryck, A.

    2015-12-01

    The amphibious Magnetotelluric Observations of Cascadia using a Huge Array (MOCHA) experiment seeks to address unresolved questions about the seismogenic locked zone and the down-dip transition zone where episodic tremor and slip (ETS) originates. The presence of free fluids is thought to be one of the primary controls on ETS behavior within the Cascadia margin. Since the bulk electrical conductivity in the crust and mantle can be greatly increased by fluids, magnetotelluric (MT) observations can offer unique insights into the fluid distribution and its relation to observed ETS behavior. Here we present preliminary results from the 146 MT stations collected for the MOCHA project. MOCHA is unique in that it is the first amphibious array of MT stations occupied to provide for 3-D interpretation of the conductivity structure of a subduction zone. The MOCHA data set comprises 75 onshore stations and 71 offshore stations, accumulated over a two-year period and located on an approximately 25 km grid, spanning from the trench to the eastern Willamette Valley, and from central Oregon into middle Washington. We present the results of a series of east-west (cross-strike) oriented, two-dimensional inversions created using the MARE2DEM software that provide an initial picture of the conductivity structure of the locked and ETS zones and its along-strike variations. Our models can be used to identify correlations between ETS occurrence rates and inferred fluid concentrations. Our modeling explores the impact of various parameterizations on 2-D inversion results, including a reduced smoothness penalty along the inferred slab interface. This series of 2-D inversions can then be used collectively to help make and guide an a priori 3-D inversion. In addition we will present a preliminary 3-D inversion of the onshore stations created using the ModEM software. We are currently working on modifying ModEM to support inversion of offshore data. The more computationally intensive 3-D

  16. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validation Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they correctly simulate and were asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.
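
    For reference, the Jelinski-Moranda model mentioned above assumes N initial faults and a failure rate proportional to the number remaining. The simulation below (my illustration, with invented parameter values) generates inter-failure times exactly according to the model, the same self-consistent setup the study used to test predictions:

    ```python
    # Simulate inter-failure times under the Jelinski-Moranda model: the
    # i-th inter-failure time is exponential with rate phi * (N - i + 1),
    # i.e., the hazard drops by phi each time a fault is found and fixed.
    import numpy as np

    rng = np.random.default_rng(42)
    N, phi = 30, 0.02          # initial fault count and per-fault hazard

    # Rates run N*phi, (N-1)*phi, ..., 1*phi as faults are removed.
    times = [rng.exponential(1.0 / (phi * (N - i))) for i in range(N)]

    print(f"first interval ~{times[0]:.1f}, last interval ~{times[-1]:.1f}")
    print(f"total test time to remove all faults: {sum(times):.1f}")
    ```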

  17. The control of float zone interfaces by the use of selected boundary conditions

    NASA Technical Reports Server (NTRS)

    Foster, L. M.; Mcintosh, J.

    1983-01-01

    The main goal of the float zone crystal growth project of NASA's Materials Processing in Space Program is to thoroughly understand the molten zone/freezing crystal system and all the mechanisms that govern this system. The surface boundary conditions required to give flat solid/melt interfaces in the float zone were studied and computed. The results provide float zone furnace designers with better methods for controlling solid/melt interface shapes and for computing thermal profiles and gradients. Documentation and a user's guide were provided for the computer software.

  18. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    NASA Astrophysics Data System (ADS)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models have been improved and expanded upon as part of the Slab2 modeling effort. With a new data-driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab-related events; (3) a 3-D data interpolation approach which captures both high-resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open-source library in Python, such that suites of updated models can be released as further data become available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  19. Resource utilization during software development

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1988-01-01

    This paper discusses resource utilization over the life cycle of software development and discusses the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. The data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.

  20. Common Practices from Two Decades of Water Resources Modelling Published in Environmental Modelling & Software: 1997 to 2016

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Peterson, M.; Larsen, J.

    2016-12-01

    A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented, including a summary of commonly used practices and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies frees the modeler to take a less technocratic and more human approach to water resources modelling.

  1. Assessment of vulnerability zones for ground water pollution using GIS-DRASTIC-EC model: A field-based approach

    NASA Astrophysics Data System (ADS)

    Anantha Rao, D.; Naik, Pradeep K.; Jain, Sunil K.; Vinod Kumar, K.; Dhanamjaya Rao, E. N.

    2018-06-01

    Assessment of groundwater vulnerability to pollution is an essential pre-requisite for better planning of an area. We present a groundwater vulnerability assessment in parts of the Yamuna Nagar District, Haryana State, India, covering an area of about 800 km2 considered to be a freshwater zone in the foothills of the Siwalik Hill Ranges. Such areas in the Lower Himalayas form good groundwater recharge zones and should always be free from contamination. But the administration has been trying to promote industrialization along these foothill zones without actually assessing the environmental consequences such activities may invite in the future. The GIS-DRASTIC model has been used with field-based data inputs for the vulnerability assessment. We find that the inclusion of electrical conductivity (EC) as a model parameter makes it more robust; therefore, we rename it the GIS-DRASTIC-EC model. The model identifies three vulnerability zones (low, moderate and high) with areal extents of 5%, 80% and 15%, respectively. On the basis of major chemical parameters alone, the groundwater in the foothill zones apparently looks safe, but analysis with the GIS-DRASTIC-EC model gives a better perspective of the groundwater quality in terms of identifying the vulnerable areas.
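
    DRASTIC itself is a weighted linear index over seven hydrogeologic factors (Depth to water, net Recharge, Aquifer media, Soil media, Topography, Impact of vadose zone, hydraulic Conductivity). The sketch below uses the standard DRASTIC weights and appends an EC term in the spirit of the paper; the EC weight and all ratings are invented for illustration:

    ```python
    # DRASTIC vulnerability index: sum of weight * rating over the factors.
    # Standard DRASTIC weights; the EC weight is a hypothetical extension.
    weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3,
               "EC": 4}  # EC weight assumed for illustration

    # Ratings (1-10) for one evaluation cell -- invented example values.
    ratings = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 9, "I": 8, "C": 4,
               "EC": 5}

    index = sum(weights[f] * ratings[f] for f in weights)
    print(f"DRASTIC-EC index: {index}")
    # Cells are then binned into low / moderate / high vulnerability zones
    # by thresholding the index across the study area.
    ```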

  2. Numerical modeling of continental lithospheric weak zone over plume

    NASA Astrophysics Data System (ADS)

    Perepechko, Y. V.; Sorokin, K. E.

    2011-12-01

    This work is devoted to the development of magmatic systems in the continental lithosphere above diffluent mantle plumes. The areas of tension originating over them are accompanied by the appearance of fault zones and the formation of permeable channels through which magmatic melts are distributed. Numerical simulation has been carried out of the dynamics of deformation fields in the lithosphere due to convection currents in the upper mantle, and of the formation of weakened zones that extend up to the upper crust and create the necessary conditions for the formation of intermediate magma chambers. A thermodynamically consistent non-isothermal model simulates the heat and mass transfer processes of a wide class of magmatic systems, as well as the process of strain localization in the lithosphere and its influence on the formation of high-permeability zones in the lower crust. The substance of the lithosphere is a rheologically heterophase medium described by two-velocity hydrodynamics. This makes it possible to take into account the penetration of the melt from the asthenosphere into the weakened zone. The energy dissipation occurs mainly due to interfacial friction and inelastic relaxation of shear stresses. The results of calculation reveal a nonlinear process of the formation of porous channels and demonstrate the diversity of emerging dissipative structures, which are determined by the properties of both the heterogeneous lithosphere and the overlying crust. The mutual effect of a permeable channel and the corresponding filtration of the melt on mantle convection and the dynamics of the asthenosphere has been studied. The formation of dissipative structures in the heterogeneous lithosphere above mantle plumes occurs in accordance with the following scenario: initially, the elastic behavior of the heterophase lithosphere leads to the formation of a narrow but sufficiently extensive weakened zone with higher porosity. Further, the increase in the width of

  3. FacetModeller: Software for manual creation, manipulation and analysis of 3D surface-based models

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Carter-McAuslan, Angela E.; Dunham, Michael W.; Jones, Drew J.; Nalepa, Mariella; Squires, Chelsea L.; Tycholiz, Cassandra J.; Vallée, Marc A.; Farquharson, Colin G.

    2018-01-01

    The creation of 3D models is commonplace in many disciplines. Models are often built from a collection of tessellated surfaces. To apply numerical methods to such models it is often necessary to generate a mesh of space-filling elements that conforms to the model surfaces. While there are meshing algorithms that can do so, they place restrictive requirements on the surface-based models that are rarely met by existing 3D model building software. Hence, we have developed a Java application named FacetModeller, designed for efficient manual creation, modification and analysis of 3D surface-based models destined for use in numerical modelling.

  4. Executable Behavioral Modeling of System and Software Architecture Specifications to Inform Resourcing Decisions

    DTIC Science & Technology

    2016-09-01

    Thesis by Monica F. Farah-Stapleton. The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.

  5. Software Design Description for the Navy Coastal Ocean Model (NCOM) Version 4.0

    DTIC Science & Technology

    2008-12-31

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--08-9149, by Paul Martin, Charlie N. Barron, Lucy F... Approved for public release; distribution is unlimited.

  6. Investing in Software Sustainment

    DTIC Science & Technology

    2015-04-30

    The colored arrows simply represent a reinforcing loop called the "Bandwagon Effect," which means that a series of successful missions will... The Software Engineering Institute (SEI) developed a simulation model for analyzing the effects of changes in demand for software sustainment and the corresponding funding decisions.

  7. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new models of the software component from the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.

  8. Double-Sided Wedge Model For Retreating Subduction Zones: Applications to the Apenninic and Hellenic Subduction Zones (Invited)

    NASA Astrophysics Data System (ADS)

    Brandon, M. T.; Willett, S.; Rahl, J. M.; Cowan, D. S.

    2009-12-01

    We propose a new model for the evolution of accreting wedges at retreating subduction zones. Advance and retreat refer to the polarity of the velocity of the overriding plate with respect to the subduction zone. Advance indicates a velocity toward the subduction zone (e.g., Andes) and retreat, away from the subduction zone (e.g., Apennines, Crete). The tectonic mode of a subduction zone, whether advancing or retreating, is a result of both the rollback of the subducting plate and the absolute motion of the overriding plate. The Hellenic and Apenninic wedges are both associated with retreating subduction zones. The Hellenic wedge has been active for about 100 Ma, whereas the Apenninic wedge has been active for about 30 Ma. Comparison of maximum metamorphic pressures for exhumed rocks in these wedges (25 and 30 km, respectively) with the maximum thickness of the wedges at present (30 and 35 km, respectively) indicates that each wedge has maintained a relatively steady size during its evolution. This conclusion follows from the fact that both frictional and viscous wedges are subject to the constraint of a steady wedge taper, so that thickness and width are strongly correlated. Both wedges show clear evidence of steady accretion during their full evolution, with accretionary fluxes of about 60 and 200 km2 Ma-1. These wedges also both show steady drift of material from the front to the rear of the wedge, with horizontal shortening dominating in the front of the wedge and horizontal extension within the back of the wedge. We propose that these wedges represent two back-to-back wedges, with a convergent wedge on the leading side (proside) and a divergent wedge on the trailing side (retroside). In this sense, the wedges are bounded by two plates. The subducting plate is familiar. It creates a thrust-sense traction beneath the proside of the wedge. The second plate is an “educting” plate, which creates a normal-sense traction beneath the retroside of the wedge. The

  9. Preferential Flow Paths In A Karstified Spring Catchment: A Study Of Fault Zones As Conduits To Rapid Groundwater Flow

    NASA Astrophysics Data System (ADS)

    Kordilla, J.; Terrell, A. N.; Veltri, M.; Sauter, M.; Schmidt, S.

    2017-12-01

    In this study we model saturated and unsaturated flow in the karstified Weendespring catchment, located within the Leinetal graben in Goettingen, Germany. We employ the finite-element COMSOL Multiphysics modeling software to model variably saturated flow using the Richards equation with a van Genuchten-type parameterization. As part of the graben structure, the Weende spring catchment is intersected by seven fault zones along the main flow path of the 7400 m cross section of the catchment. As the Weende spring is part of the drinking water supply in Goettingen, it is particularly important to understand the vulnerability of the catchment and the effect of fault zones on rapid transport of contaminants. Nitrate signals have been observed at the spring only a few days after the application of fertilizers within the catchment at a distance of approximately 2 km. As the underlying layers are known to be highly impermeable, fault zones within the area are likely to create rapid flow paths to the water table and the spring. The model conceptualizes the catchment as containing three hydrogeological limestone units with varying degrees of karstification: the lower Muschelkalk limestone as a highly conductive layer, the middle Muschelkalk as an aquitard, and the upper Muschelkalk as another conductive layer. The fault zones are parameterized based on a combination of field data from quarries, remote sensing, and literature data. Each fault zone is modeled considering the fracture core as well as the surrounding damage zone, with separate, specific hydraulic properties. The 2D conceptual model was implemented in COMSOL to study unsaturated flow at the catchment scale using van Genuchten parameters. The study demonstrates the importance of fault zones for preferential flow within the catchment and their effect on the spatial distribution of vulnerability.

  10. Evolution of the conceptual model of unsaturated zone hydrology at Yucca Mountain, Nevada

    NASA Astrophysics Data System (ADS)

    Flint, Alan L.; Flint, Lorraine E.; Bodvarsson, Gudmundur S.; Kwicklis, Edward M.; Fabryka-Martin, June

    2001-06-01

    Yucca Mountain is an arid site proposed for consideration as the United States' first underground high-level radioactive waste repository. Low rainfall (approximately 170 mm/yr) and a thick unsaturated zone (500-1000 m) are important physical attributes of the site because the quantity of water likely to reach the waste and the paths and rates of movement of the water to the saturated zone under future climates would be major factors in controlling the concentrations and times of arrival of radionuclides at the surrounding accessible environment. The framework for understanding the hydrologic processes that occur at this site and that control how quickly water will penetrate through the unsaturated zone to the water table has evolved during the past 15 yr. Early conceptual models assumed that very small volumes of water infiltrated into the bedrock (0.5-4.5 mm/yr, or 2-3 percent of rainfall), that much of the infiltrated water flowed laterally within the upper nonwelded units because of capillary barrier effects, and that the remaining water flowed down faults with a small amount flowing through the matrix of the lower welded, fractured rocks. It was believed that the matrix had to be saturated for fractures to flow. However, accumulating evidence, such as infiltration modeling based on neutron borehole data, bomb-pulse isotopes deep in the mountain, perched water analyses and thermal analyses, indicated that infiltration rates were higher than initially estimated. Mechanisms supporting lateral diversion did not apply at these higher fluxes, and the flux calculated in the lower welded unit exceeded the conductivity of the matrix, implying vertical flow of water in the high permeability fractures of the potential repository host rock, and disequilibrium between matrix and fracture water potentials. The development of numerical modeling methods and parameter values evolved concurrently with the conceptual model in order to account for the observed field data.

  11. Evolution of the conceptual model of unsaturated zone hydrology at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Flint, Alan L.; Flint, Lorraine E.; Bodvarsson, Gudmundur S.; Kwicklis, Edward M.; Fabryka-Martin, June

    2001-01-01

    Yucca Mountain is an arid site proposed for consideration as the United States’ first underground high-level radioactive waste repository. Low rainfall (approximately 170 mm/yr) and a thick unsaturated zone (500–1000 m) are important physical attributes of the site because the quantity of water likely to reach the waste and the paths and rates of movement of the water to the saturated zone under future climates would be major factors in controlling the concentrations and times of arrival of radionuclides at the surrounding accessible environment. The framework for understanding the hydrologic processes that occur at this site and that control how quickly water will penetrate through the unsaturated zone to the water table has evolved during the past 15 yr. Early conceptual models assumed that very small volumes of water infiltrated into the bedrock (0.5–4.5 mm/yr, or 2–3 percent of rainfall), that much of the infiltrated water flowed laterally within the upper nonwelded units because of capillary barrier effects, and that the remaining water flowed down faults with a small amount flowing through the matrix of the lower welded, fractured rocks. It was believed that the matrix had to be saturated for fractures to flow. However, accumulating evidence, such as infiltration modeling based on neutron borehole data, bomb-pulse isotopes deep in the mountain, perched water analyses and thermal analyses, indicated that infiltration rates were higher than initially estimated. Mechanisms supporting lateral diversion did not apply at these higher fluxes, and the flux calculated in the lower welded unit exceeded the conductivity of the matrix, implying vertical flow of water in the high permeability fractures of the potential repository host rock, and disequilibrium between matrix and fracture water potentials. The development of numerical modeling methods and parameter values evolved concurrently with the conceptual model in order to account for the observed field data.

  12. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0109] NUREG/CR-XXXX, Development of Quantitative Software..., ``Development of Quantitative Software Reliability Models for Digital Protection Systems of Nuclear Power Plants... of Risk Analysis, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission...

  13. Model of the Streamer Zone of a Leader

    NASA Astrophysics Data System (ADS)

    Milikh, G. M.; Raina, A.; Shneider, M.; Likhanskii, A.; George, A.

    2015-12-01

    Developed leaders represent highly conductive plasma channels, continuously emitting a fan of streamers, termed the streamer zone. The leader tip moves at a speed much slower than that of individual streamers. A huge number of short-lived streamers in the corona generate the space charge field required to maintain the streamer propagation. A critical issue is the conversion from the streamer to the leader phase [Da Silva and Pasko, 2013]. The objective of this paper is to present simulations of the formation and propagation of the streamer zone of a leader. In these simulations we generated a group of streamers that propagate in a discharge gap while interacting with each other. We use the modified numerical model [Likhanskii et al., 2007] developed to simulate discharge plasma actuators driven by nanosecond pulses. The model uses a 2D rectangular computational box, and the discharge gap is filled with air at normal conditions. Furthermore, the model considers electrons and positive and negative ions. The plasma kinetics and interaction with neutral molecules are modeled in a drift-diffusion approximation [Likhanskii et al., 2007]. The electric field and potential are related to the density of charged species according to the Poisson equation, which was solved by the successive over-relaxation method. It is shown that interaction between the streamers significantly reduces their propagation velocity. Furthermore, the streamer velocity depends on the distance between the streamers: the smaller that distance, the stronger the suppression of the streamer velocity. This explains why the leader, which consists of many streamers, is much slower than a single streamer formed in the same discharge gap. References: C.L. Da Silva and V.P. Pasko, J. Geophys. Res.: Atmospheres, 118, 1-30, 2013; A.V. Likhanskii et al., Phys. Plasmas, 14, 073501, 2007.
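
    The Poisson step named above has a compact implementation. The sketch below (my illustration, with an invented charge distribution and grid) applies successive over-relaxation to the finite-difference Poisson equation on a 2D grid, the scheme the abstract says the model uses:

    ```python
    # Successive over-relaxation (SOR) for the 2-D Poisson equation
    # laplacian(phi) = -rho/eps0 on a uniform grid, phi = 0 on the boundary.
    # Grid size, charge blob, and omega are illustrative choices.
    import numpy as np

    n, h = 33, 1e-4              # grid points per side and spacing (m)
    eps0 = 8.854e-12
    rho = np.zeros((n, n))       # charge density with a central blob (C/m^3)
    rho[n // 2 - 2 : n // 2 + 2, n // 2 - 2 : n // 2 + 2] = 1e-3

    phi = np.zeros((n, n))
    omega = 1.8                  # over-relaxation factor, 1 < omega < 2
    for sweep in range(2000):
        max_delta = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                gauss = 0.25 * (phi[i + 1, j] + phi[i - 1, j]
                                + phi[i, j + 1] + phi[i, j - 1]
                                + h * h * rho[i, j] / eps0)
                new = (1.0 - omega) * phi[i, j] + omega * gauss
                max_delta = max(max_delta, abs(new - phi[i, j]))
                phi[i, j] = new
        if max_delta < 1e-6:     # converged
            break
    print(f"sweeps: {sweep + 1}, peak potential: {phi.max():.3e} V")
    ```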

  14. Software engineering and Ada (Trademark) training: An implementation model for NASA

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  15. MPS Solidification Model. Volume 2: Operating guide and software documentation for the unsteady model

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1981-01-01

    The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.

  16. Science and Software

    NASA Astrophysics Data System (ADS)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  17. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    PubMed

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
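
    For a sense of what such programs implement, here is a minimal three-state Markov cohort simulation in Python (my illustration; transition probabilities, costs, and utility weights are invented), the kind of model whose processing speed the comparison measures:

    ```python
    # Minimal Markov cohort model: Healthy / Sick / Dead, yearly cycles.
    # All probabilities, costs, and utilities are invented for illustration.
    import numpy as np

    P = np.array([[0.85, 0.10, 0.05],   # transitions from Healthy
                  [0.00, 0.70, 0.30],   # from Sick
                  [0.00, 0.00, 1.00]])  # Dead is absorbing

    cost = np.array([500.0, 3000.0, 0.0])   # annual cost per state
    qaly = np.array([0.95, 0.60, 0.0])      # utility per state
    disc = 0.035                             # annual discount rate

    state = np.array([1.0, 0.0, 0.0])       # cohort starts Healthy
    total_cost = total_qaly = 0.0
    for year in range(40):                   # 40-year horizon
        df = 1.0 / (1.0 + disc) ** year      # discount factor this cycle
        total_cost += df * state @ cost
        total_qaly += df * state @ qaly
        state = state @ P                    # advance the cohort one cycle

    print(f"discounted cost {total_cost:,.0f}  QALYs {total_qaly:.2f}")
    ```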

  18. Histological Characterization of the Irritative Zones in Focal Cortical Dysplasia Using a Preclinical Rat Model.

    PubMed

    Deshmukh, Abhay; Leichner, Jared; Bae, Jihye; Song, Yinchen; Valdés-Hernández, Pedro A; Lin, Wei-Chiang; Riera, Jorge J

    2018-01-01

    Current clinical practice in focal epilepsy involves brain source imaging (BSI) to localize brain areas from which interictal epileptiform discharges (IEDs) emerge. These areas, named irritative zones, have been useful for defining candidate seizure-onset zones during pre-surgical workup. Since human histological data are mostly available from final resected zones, systematic studies characterizing pathophysiological mechanisms and abnormal molecular/cellular substrates in irritative zones, independent of them being epileptogenic, are challenging. Combining BSI and histological analysis from all types of irritative zones is only possible through the use of preclinical animal models. Here, we recorded 32-channel spontaneous electroencephalographic data from rats that have focal cortical dysplasia (FCD) and chronic seizures. BSI for different IED subtypes was performed using the methodology presented in Bae et al. (2015). Post-mortem brain sections containing irritative zones were stained to quantify anatomical, functional, and inflammatory biomarkers specific for epileptogenesis, and the results were compared with those obtained using the contralateral healthy brain tissue. We found abnormal anatomical structures in all irritative zones (i.e., larger neuronal processes, glioreactivity, and vascular cuffing) and higher expression of markers of neurotransmission (i.e., NR2B) and inflammation (i.e., IL-1β, TNFα and HMGB1). We conclude that irritative zones in this rat preclinical model of FCD comprise abnormal tissues regardless of whether they are actually involved in ictogenesis. We hypothesize that seizure perpetuation happens gradually; hence, our results could support the use of IED-based BSI for the early diagnosis and preventive treatment of potential epileptic foci. Further verification in humans is yet needed.

  19. Optimised layout and roadway support planning with integrated intelligent software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouniali, S.; Josien, J.P.; Piguet, J.P.

    1996-12-01

    Experience with knowledge-based systems for layout planning and roadway support dimensioning has been accumulating in European coal mining since 1985. The systems SOUT (support choice and dimensioning, 1989), SOUT 2, PLANANK (planning of bolt support), EXOS (layout planning diagnosis, 1994) and SOUT 3 (1995) have been developed in close cooperation by CdF, INERIS and EMN (France) and RAG, DMT and TH Aachen (Germany); development of ISLSP (Integrated Software for Layout and Support Planning) is in progress (completion scheduled for July 1996). This new software technology, in combination with conventional programming systems, numerical models and existing databases, turned out to be well suited for setting up an intelligent decision aid for layout and roadway support planning. The system enhances the reliability of planning and optimises the safety-to-cost ratio for (1) deformation forecasts for roadways in seam and surrounding rock, taking into account the general position of the roadway in the rock mass (zones of increased pressure, position of operating and mined panels); (2) support dimensioning; (3) yielding arches, rigid arches, porch sets, rigid rings, yielding rings and bolting/shotcreting for drifts; (4) yielding arches, rigid arches and porch sets for roadways in seam; and (5) bolt support for gateroads (assessment of exclusion criteria and calculation of the bolting pattern) and bolting of face-end zones (feasibility and safety assessment; stability guarantee).

  20. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    PubMed

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fitted with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), a generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation through detection of phenological differences. From the Landsat scenes, we used individual bands and calculated the Normalized Difference Vegetation Index (NDVI), the Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models successfully identified current tamarisk distribution on the landscape, based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models, with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
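
    The core workflow (fit several presence/absence classifiers to the same predictors, then combine them into an ensemble with an agreement map) can be illustrated compactly. The sketch below uses scikit-learn stand-ins for three of the five techniques (GLM as logistic regression, RF, and BRT as gradient boosting) on synthetic data; it is an illustration of the ensemble idea under those assumptions, not SAHM itself.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

    # Synthetic stand-in for presence/pseudo-absence points with spectral predictors
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 4))                  # e.g., NDVI, SAVI, two TM bands
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=600) > 0).astype(int)

    models = {
        "GLM": LogisticRegression(max_iter=1000),
        "RF": RandomForestClassifier(n_estimators=200, random_state=0),
        "BRT": GradientBoostingClassifier(random_state=0),
    }
    probs = np.column_stack([m.fit(X, y).predict_proba(X)[:, 1]
                             for m in models.values()])

    ensemble = probs.mean(axis=1)              # mean habitat-suitability surface
    votes = (probs > 0.5).sum(axis=1)          # 0-3 models predicting presence
    unanimous = ((votes == 0) | (votes == 3)).mean()
    print(f"points where all models agree: {unanimous:.0%}")
    ```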

  1. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    USGS Publications Warehouse

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fitted with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), a generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation through detection of phenological differences. From the Landsat scenes, we used individual bands and calculated the Normalized Difference Vegetation Index (NDVI), the Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models successfully identified current tamarisk distribution on the landscape, based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models, with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  2. Stress drop inferred from dynamic rupture simulations consistent with Moment-Rupture area empirical scaling models: Effects of weak shallow zone

    NASA Astrophysics Data System (ADS)

    Dalguer, L. A.; Miyake, H.; Irikura, K.; Wu, H., Sr.

    2016-12-01

    Empirical scaling models of seismic moment and rupture area provide constraints for parameterizing source parameters, such as stress drop, in numerical simulations of ground motion. Several scaling models have been published in the literature. The finite width of the seismogenic zone and the free surface have been invoked as causes of the breakdown of the well-known self-similar scaling (e.g., Dalguer et al., 2008), giving rise to the so-called L and W models for large faults. These models imply the existence of a three-stage scaling relationship between seismic moment and rupture area (e.g., Irikura and Miyake, 2011). In this paper we extend the work of Dalguer et al. (2008), in which the authors calibrated fault models that match observations, showing that the average stress drop is independent of earthquake size for buried earthquakes but scale-dependent for surface-rupturing earthquakes. Here we have developed additional sets of dynamic rupture models for vertical strike-slip faults to evaluate the effect of a weak shallow layer (WSL) zone on the calibration of stress drop. Rupture in the WSL zone is expected to operate with an enhanced energy absorption mechanism. The set of dynamic models consists of fault models with a width of 20 km, fault lengths L = 20 km, 40 km, 60 km, 80 km, 100 km, 120 km, 200 km, 300 km and 400 km, and average stress drop values of 2.0 MPa, 2.5 MPa, 3.0 MPa, 3.5 MPa, 5.0 MPa and 7.5 MPa. For models that break the free surface, the WSL zone is modeled assuming a 2 km width with a stress drop of 0.0 MPa or -2.0 MPa. Our results show that, depending on the characterization of the WSL zone, the average stress drop in the seismogenic zone that fits the empirical models changes. If no WSL zone is considered, that is, if the stress drop in the shallow layer is the same as in the seismogenic zone, the average stress drop is about 20% smaller than in models with a WSL zone. By introducing more energy absorption in the WSL zone, as could be the case for large mature faults, the average stress drop
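
    For orientation, the standard circular-crack relation links average stress drop, seismic moment and rupture area; under self-similar scaling, moment grows as area to the 3/2 power and stress drop is size-independent. The sketch below checks that arithmetic with illustrative values only; the paper's results come from full dynamic rupture simulations, not this formula.

    ```python
    import numpy as np

    def stress_drop_circular(m0, area):
        """Eshelby circular-crack estimate: delta_sigma = (7/16) * M0 / R^3 (Pa)."""
        radius = np.sqrt(area / np.pi)
        return (7.0 / 16.0) * m0 / radius**3

    # Build moments from a constant 3 MPa stress drop, then recover it:
    delta_sigma = 3.0e6                           # Pa, within the explored range
    areas = np.array([1e8, 4e8, 1.6e9])           # rupture areas, m^2
    radii = np.sqrt(areas / np.pi)
    m0 = (16.0 / 7.0) * delta_sigma * radii**3    # M0 scales as A^(3/2)

    print(stress_drop_circular(m0, areas) / 1e6)  # -> [3. 3. 3.] MPa, size-independent
    ```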

  3. Adapting Better Interpolation Methods to Model Amphibious MT Data Along the Cascadia Subduction Zone.

    NASA Astrophysics Data System (ADS)

    Parris, B. A.; Egbert, G. D.; Key, K.; Livelybrooks, D.

    2016-12-01

    Magnetotellurics (MT) is an electromagnetic technique used to model the electrical conductivity structure of the Earth's interior. MT data can be analyzed using iterative, linearized inversion techniques to generate models imaging, in particular, conductive partial melts and aqueous fluids that play critical roles in subduction zone processes and volcanism. For example, the Magnetotelluric Observations of Cascadia using a Huge Array (MOCHA) experiment provides amphibious data useful for imaging subducted fluids from the trench to the mantle wedge corner. When using MOD3DEM (Egbert et al., 2012), a finite-difference inversion package, we have encountered problems inverting data from seafloor stations in particular, due to the strong conductivity gradients near them. As a work-around, we have found that denser, finer model grids near the land-sea interface produce better inversions, as characterized by reduced data residuals. This is thought to be partly due to our ability to more accurately capture topography and bathymetry. We are experimenting with improved interpolation schemes that more accurately track EM fields across cell boundaries, with an eye to enhancing the accuracy of the simulated responses and, thus, of the inversion results. We are adapting how MOD3DEM interpolates EM fields in two ways. The first seeks to improve the weighting functions for interpolants to better address current continuity across grid boundaries. Electric fields are interpolated using a tri-linear spline technique, in which the eight nearest electric field estimates are each given weights determined by the technique, a kind of weighted average. We are modifying these weights to include cross-boundary conductivity ratios to better model current continuity. We are also adapting some of the techniques discussed in Shantsev et al. (2014) to enhance the accuracy of the interpolated fields calculated by our forward solver, as well as to better approximate the sensitivities passed to the software's Jacobian that are used to generate a new
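
    The weighting idea can be sketched in a few lines: compute standard tri-linear weights from a point's fractional position in a cell, then rescale each corner's weight by a conductivity ratio before renormalizing, so that corners on the same side of a sharp land-sea contrast dominate. This is a toy illustration of the concept (the function names and the specific rescaling are assumptions), not the MOD3DEM implementation.

    ```python
    import numpy as np
    from itertools import product

    def trilinear_weights(xd, yd, zd):
        """Standard tri-linear weights for the 8 cell corners; xd, yd, zd in [0, 1]."""
        return np.array([
            (xd if i else 1 - xd) * (yd if j else 1 - yd) * (zd if k else 1 - zd)
            for i, j, k in product((0, 1), repeat=3)
        ])

    def conductivity_scaled_weights(w, sigma_corners, sigma_local):
        """Down-weight corners whose conductivity differs sharply from the local value."""
        ratio = (np.minimum(sigma_corners, sigma_local)
                 / np.maximum(sigma_corners, sigma_local))
        w_scaled = w * ratio
        return w_scaled / w_scaled.sum()

    w = trilinear_weights(0.5, 0.5, 0.5)                    # cell-centered point
    sigma = np.array([3.3, 3.3, 3.3, 3.3, 0.01, 0.01, 0.01, 0.01])  # seawater vs. crust
    print(conductivity_scaled_weights(w, sigma, sigma_local=3.3))   # seaward corners dominate
    ```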

  4. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    NASA Astrophysics Data System (ADS)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil-producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of a hydrocarbon accumulation with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large matrices (up to hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in inverse tasks such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
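
    When the operand matrices exceed accelerator memory, the standard remedy is tiled (blockwise) multiplication, streaming one tile at a time. The sketch below shows the pattern in plain NumPy (CuPy mirrors this API on NVIDIA GPUs); it illustrates the general technique, not TWSM's actual kernels.

    ```python
    import numpy as np

    def blockwise_matmul(a, b, block=1024):
        """Multiply large matrices tile by tile so each tile fits in device memory."""
        m, k = a.shape
        k2, n = b.shape
        assert k == k2, "inner dimensions must agree"
        out = np.zeros((m, n), dtype=np.result_type(a, b))
        for i in range(0, m, block):
            for j in range(0, n, block):
                for p in range(0, k, block):   # accumulate partial products per tile
                    out[i:i+block, j:j+block] += (a[i:i+block, p:p+block]
                                                  @ b[p:p+block, j:j+block])
        return out

    a, b = np.random.rand(2000, 1500), np.random.rand(1500, 1800)
    assert np.allclose(blockwise_matmul(a, b), a @ b)   # tiled result matches direct
    ```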

  5. The Regional Hydrologic Extremes Assessment System: A software framework for hydrologic modeling and data assimilation

    PubMed Central

    Das, Narendra; Stampoulis, Dimitrios; Ines, Amor; Fisher, Joshua B.; Granger, Stephanie; Kawata, Jessie; Han, Eunjin; Behrangi, Ali

    2017-01-01

    The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications. A spatially-enabled database is a key component of the software that can ingest a suite of satellite and model datasets while facilitating interfacing with Geographic Information System (GIS) applications. The datasets ingested are obtained from numerous space-borne sensors and represent multiple components of the water cycle. The object-oriented design of the software allows for modularity and extensibility, showcased here with the coupling of the core hydrologic model with a crop growth model. RHEAS can exploit multi-threading to scale with an increasing number of processors, while the database allows delivery of data products and associated uncertainty through a variety of GIS platforms. A set of three example implementations of RHEAS in the United States and Kenya is described to demonstrate the different features of the system in real-world applications. PMID:28545077

  6. The Regional Hydrologic Extremes Assessment System: A software framework for hydrologic modeling and data assimilation.

    PubMed

    Andreadis, Konstantinos M; Das, Narendra; Stampoulis, Dimitrios; Ines, Amor; Fisher, Joshua B; Granger, Stephanie; Kawata, Jessie; Han, Eunjin; Behrangi, Ali

    2017-01-01

    The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications. A spatially-enabled database is a key component of the software that can ingest a suite of satellite and model datasets while facilitating interfacing with Geographic Information System (GIS) applications. The datasets ingested are obtained from numerous space-borne sensors and represent multiple components of the water cycle. The object-oriented design of the software allows for modularity and extensibility, showcased here with the coupling of the core hydrologic model with a crop growth model. RHEAS can exploit multi-threading to scale with an increasing number of processors, while the database allows delivery of data products and associated uncertainty through a variety of GIS platforms. A set of three example implementations of RHEAS in the United States and Kenya is described to demonstrate the different features of the system in real-world applications.
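
    The coupling highlighted above, a core hydrologic model driving a crop growth model through an object-oriented interface, follows a common pattern: each model advances one step and exposes state the other consumes. The toy classes below illustrate that pattern only; the dynamics and parameters are invented and bear no relation to RHEAS internals.

    ```python
    class HydrologicModel:
        """Toy bucket water balance standing in for the core hydrologic model."""
        def __init__(self, soil_moisture=0.3):
            self.soil_moisture = soil_moisture

        def step(self, precip, evapotranspiration):
            self.soil_moisture = min(1.0, max(0.0, self.soil_moisture
                                              + precip - evapotranspiration))
            return self.soil_moisture

    class CropModel:
        """Toy crop growth model consuming the hydrologic model's state."""
        def __init__(self):
            self.biomass = 0.0

        def step(self, soil_moisture):
            self.biomass += 0.1 * soil_moisture   # growth limited by available water
            return self.biomass

    hydro, crop = HydrologicModel(), CropModel()
    for precip, et in [(0.02, 0.01), (0.00, 0.015), (0.05, 0.01)]:   # daily forcings
        crop.step(hydro.step(precip, et))
    print(f"biomass index after 3 days: {crop.biomass:.3f}")
    ```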

  7. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault-tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes that part of their failure can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area for future research.
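
    A typical member of this model family fits a nonhomogeneous Poisson process to cumulative fault counts from debugging. The sketch below fits the Goel-Okumoto mean-value function m(t) = a(1 - e^(-bt)) to invented weekly counts; the model choice and data are assumptions for illustration, since the paper surveys the class rather than endorsing one model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def goel_okumoto(t, a, b):
        """Expected cumulative failures by time t: a * (1 - exp(-b * t))."""
        return a * (1.0 - np.exp(-b * t))

    # hypothetical debugging data: test week vs. cumulative faults found
    t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
    n = np.array([4, 9, 12, 15, 17, 18, 19, 19], dtype=float)

    (a, b), _ = curve_fit(goel_okumoto, t, n, p0=[25.0, 0.3])
    print(f"estimated total faults a={a:.1f}, detection rate b={b:.2f}")
    print(f"expected residual faults after week 8: {a - goel_okumoto(8.0, a, b):.1f}")
    ```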

  8. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates have a substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics, which utilizes continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
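
    The hybrid idea can be shown in miniature: a discrete-event loop schedules task completions while a continuous "productivity" state, the system dynamics flavor, is updated between events and feeds back into task durations. All names and parameters below are invented stand-ins, not the paper's calibrated model.

    ```python
    import heapq

    def simulate(n_tasks=20, base_rate=1.0, fatigue=0.02):
        """Discrete events (task completions) + continuous productivity decay."""
        t, productivity = 0.0, 1.0
        events = [(1.0 / (base_rate * productivity), 1)]   # (completion time, task id)
        completed = []
        while events and len(completed) < n_tasks:
            t_next, task = heapq.heappop(events)
            dt = t_next - t
            # continuous part: productivity erodes with sustained effort
            productivity = max(0.5, productivity - fatigue * dt)
            t = t_next
            completed.append((task, t))
            if task < n_tasks:
                duration = 1.0 / (base_rate * productivity)   # feedback into DES
                heapq.heappush(events, (t + duration, task + 1))
        return completed

    schedule = simulate()
    print(f"project finishes at t={schedule[-1][1]:.1f} under productivity decay")
    ```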

  9. A Software Development Simulation Model of a Spiral Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  10. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the discretion of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we constructed an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality on a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods over the collected data, the final optimal structure of the novel model was formed. Special attention was paid to the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting novel model is validated in multiple ways, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.
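
    The analytical core of the study, exploratory factor analysis over Likert-scale responses, is easy to sketch. The example below generates synthetic questionnaire data from two hidden quality factors and recovers item loadings with scikit-learn; the item count, factor count and data are assumptions for illustration, not the study's survey.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(384, 2))               # two hidden quality factors
    loadings = rng.uniform(0.4, 0.9, size=(2, 12))   # 12 questionnaire items
    noise = rng.normal(scale=0.5, size=(384, 12))
    items = np.clip(np.rint(3 + latent @ loadings + noise), 1, 5)  # 1-5 Likert scores

    fa = FactorAnalysis(n_components=2).fit(items)
    print(np.round(fa.components_, 2))   # item loadings on the extracted factors
    ```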

  11. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
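
    Of the basic techniques named above, the exact Bayes factor is the most compact to demonstrate. The sketch below computes it in closed form for the simplest possible graphical model, a single binomial node, comparing a fixed-parameter hypothesis against a uniform-prior alternative; the data are invented.

    ```python
    import numpy as np
    from scipy.special import betaln

    # Exact Bayes factor for a single binomial node: M0 fixes p = 0.5, while
    # M1 places a uniform Beta(1, 1) prior on p. The binomial coefficient is
    # common to both marginal likelihoods and cancels in the ratio.

    k, n = 14, 20                                     # successes, trials (invented)

    log_m0 = k * np.log(0.5) + (n - k) * np.log(0.5)
    log_m1 = betaln(k + 1, n - k + 1) - betaln(1, 1)  # Beta-binomial marginal

    print(f"BF(M1 : M0) = {np.exp(log_m1 - log_m0):.2f}")
    ```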

  12. Towards "realistic" fault zones in a 3D structure model of the Thuringian Basin, Germany

    NASA Astrophysics Data System (ADS)

    Kley, J.; Malz, A.; Donndorf, S.; Fischer, T.; Zehner, B.

    2012-04-01

    3D computer models of geological architecture are evolving into a standard tool for visualization and analysis. Such models typically comprise the bounding surfaces of stratigraphic layers and faults. Faults affect the continuity of aquifers and can themselves act as fluid conduits or barriers. This is one reason why a "realistic" representation of faults in 3D models is desirable. Even so, many existing models treat faults in a simplistic fashion, e.g. as vertical downward projections of fault traces observed at the surface. Besides being geologically and mechanically unreasonable, this also causes technical difficulties in the modelling workflow. Most natural faults are inclined and may change dip according to rock type or flatten into mechanically weak layers. Boreholes located close to a fault can therefore cross it at depth, resulting in stratigraphic control points allocated to the wrong block. Also, faults tend to split up into several branches, forming fault zones. Obtaining a more accurate representation of faults and fault zones is therefore challenging. We present work in progress from the Thuringian Basin in central Germany. The fault zone geometries are never fully constrained by data and must be extrapolated to depth. We use balancing of serial, parallel cross-sections to constrain subsurface extrapolations. The structure sections are checked for consistency by restoring them to an undeformed state. If this is possible without producing gaps or overlaps, the interpretation is considered valid (but not unique) for a single cross-section. Additional constraints are provided by comparison of adjacent cross-sections. Structures should change continuously from one section to another. Also, from the deformed and restored cross-sections we can measure the strain incurred during deformation. Strain should be compatible among the cross-sections: if it varies at all, it should vary smoothly and systematically along a given fault zone. The stratigraphic contacts and

  13. Observation model and parameter partials for the JPL geodetic (GPS) modeling software 'GPSOMC'

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.

    1990-01-01

    The physical models employed in GPSOMC, the modeling module of the GIPSY software system developed at JPL for analysis of geodetic Global Positioning Satellite (GPS) measurements, are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities with their counterparts in the computer programs. The present version is the second revision of the original document, which it supersedes. The modeling is expanded to provide the option of using Cartesian station coordinates; parameters for the time rates of change of universal time and polar motion are also introduced.

  14. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software model studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  15. Reusable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop

    NASA Technical Reports Server (NTRS)

    Cottrell, William L.

    1994-01-01

    The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object-oriented software engineering, including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object-oriented analysis is depicted. Dynamic and functional modeling are used to develop the system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and its timelines are charted.

  16. State Enterprise Zone Programs: Have They Worked?

    ERIC Educational Resources Information Center

    Peters, Alan H.; Fisher, Peter S.

    The effectiveness of state enterprise zone programs was examined by using a hypothetical-firm model called the Tax and Incentives Model-Enterprise Zones (TAIM-ez) model to analyze the value of enterprise zone incentives to businesses across the United States and especially in the 13 states that had substantial enterprise zone programs by 1990. The…

  17. Prototype software model for designing intruder detection systems with simulation

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh

    1998-08-01

    This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment in which the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.
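
    The security-level assessment reduces to a Monte Carlo estimate: simulate many intruder paths against a sensor layout and report the fraction detected. The sketch below is a deliberately minimal version of that idea with an invented layout, random-walk intruders and circular detection footprints; a real tool would draw both from its external database.

    ```python
    import random

    SENSORS = [(2, 3, 1.5), (6, 7, 2.0), (8, 2, 1.5)]   # (x, y, detection radius)

    def detected(path):
        """True if any sensor's footprint covers any point on the path."""
        return any(any((x - sx) ** 2 + (y - sy) ** 2 <= r * r for x, y in path)
                   for sx, sy, r in SENSORS)

    def random_path(steps=25):
        """Random-walk intruder entering at the facility corner."""
        x, y, path = 0.0, 0.0, []
        for _ in range(steps):
            x += random.uniform(0, 0.5)
            y += random.uniform(0, 0.5)
            path.append((x, y))
        return path

    runs = 10_000
    security = sum(detected(random_path()) for _ in range(runs)) / runs
    print(f"estimated detection probability: {security:.2%}")
    ```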

  18. A numerical study of zone-melting process for the thermoelectric material of Bi2Te3

    NASA Astrophysics Data System (ADS)

    Chen, W. C.; Wu, Y. C.; Hwang, W. S.; Hsieh, H. L.; Huang, J. Y.; Huang, T. K.

    2015-06-01

    In this study, a numerical model was established using commercial software, ProCAST, to simulate the variation/distribution of temperature and the resulting microstructure of Bi2Te3 fabricated by the zone-melting technique. An experiment was then conducted to measure the temperature variation/distribution during the zone-melting process in order to validate the numerical system. The effects of processing parameters such as heater moving speed and heater temperature on the crystallization microstructure are also numerically evaluated. In the experiment, Bi2Te3 powder is filled into a 30 mm diameter quartz cylinder and the heater is set to 800°C with a moving speed of 12.5 mm/hr. A thermocouple is inserted in the Bi2Te3 powder to measure the temperature variation/distribution during the zone-melting process. The temperature variation/distribution measured in the experiment is compared with the results of the numerical simulation, and the two match well. The model is then used to evaluate crystal formation for the 30 mm diameter Bi2Te3 process. It is found that when the moving speed is slower than 17.5 mm/hr, columnar crystal is obtained. Finally, we use this model to predict crystal formation in the zone-melting process for Bi2Te3 with a 45 mm diameter. The results show that it is difficult to grow columnar crystal when the diameter reaches 45 mm.

  19. Modelling groundwater seepage zones in an unconfined aquifer with MODFLOW: different approaches

    NASA Astrophysics Data System (ADS)

    Leterme, Bertrand; Gedeon, Matej

    2014-05-01

    In areas where the groundwater level occurs close to the surface topography, the discharge of groundwater flow to the ground surface (or seepage) can be an important aspect of the catchment hydrological cycle. It is also associated with ecologically valuable zones, often having a permanent shallow water table and constant lithotrophic water quality (Batelaan et al., 2003). In the present study, we try to implement a correct representation of this seepage process in a coupled MODFLOW-HYDRUS model for a small catchment (18.6 km²) of north-east Belgium. We started from an existing transient groundwater model of the unconfined aquifer in the study area (Gedeon and Mallants, 2009), discretized in 50x50 m cells. As the model did not account for seepage, hydraulic heads were simulated above the surface topography in certain zones. In the coupled MODFLOW-HYDRUS setup, transient boundary conditions (potential evapotranspiration and precipitation) are used to calculate the recharge with the HYDRUS package (Seo et al., 2007) for MODFLOW-2000 (Harbaugh et al., 2000). Coupling HYDRUS to MODFLOW involves the definition of a number of zones based on similarity in estimated groundwater depth, soil type and land cover. Concerning the simulation of seepage, several existing packages are tested, including the DRAIN package (as in Reeve et al., 2006), the SPF package (from the VSF Process; Thoms et al., 2006) and the PBC package (Post, 2011). As an alternative to the HYDRUS package for MODFLOW, the UZF package (Niswonger et al., 2006) for the simulation of recharge (and seepage) is also tested. When applicable, the parameterization of drain conductance in the top layer is critical and is investigated in relation to the soil hydraulic conductivity values used for the unsaturated zone (HYDRUS). Furthermore, stability issues are discussed, and where successful model runs are obtained, simulation results are compared with observed groundwater levels from a piezometric network. Spatial and
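
    The drain-type boundary several of these packages implement is simple at its core: a cell loses water only while its head exceeds the drain (here, land-surface) elevation, at a rate set by a conductance term whose value is exactly the critical parameter the abstract mentions. A minimal sketch, with invented numbers:

    ```python
    # Drain-type seepage boundary (the approach of MODFLOW's DRAIN package):
    # outflow is proportional to the head excess above the drain elevation.

    def drain_flux(head, drain_elevation, conductance):
        """Seepage flux out of a cell (L^3/T); zero when head is below the drain."""
        return conductance * max(0.0, head - drain_elevation)

    # one cell at the land surface: head 12.3 m, surface at 12.0 m, C = 50 m^2/d
    print(drain_flux(head=12.3, drain_elevation=12.0, conductance=50.0))  # -> 15.0 m^3/d
    ```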

  20. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information, which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information, which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial-order information is safe and the whole state space is explored.
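
    The control flow of that iteration fits in a few lines. In the skeleton below, the two analysis functions are hypothetical stand-ins (a real static analyzer and model checker would be far more involved); the point is the loop shape, which stops once the partial-order information no longer changes.

    ```python
    def static_analysis(alias_facts):
        """Return independent pairs; more alias facts -> fewer independent pairs."""
        all_pairs = {("t1.a", "t2.b"), ("t1.a", "t2.c"), ("t1.b", "t2.c")}
        return all_pairs - alias_facts

    def model_check(independent_pairs):
        """Explore the reduced state space; return alias facts discovered."""
        return {("t1.a", "t2.b")} if ("t1.a", "t2.b") in independent_pairs else set()

    aliases, independent = set(), None
    while True:
        new_independent = static_analysis(aliases)
        if new_independent == independent:    # fixed point: information is stable
            break
        independent = new_independent
        aliases |= model_check(independent)   # feedback refines the next pass

    print(f"safe partial-order reduction uses {len(independent)} independent pairs")
    ```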

  1. A CBI Model for the Design of CAI Software by Teachers/Nonprogrammers.

    ERIC Educational Resources Information Center

    Tessmer, Martin; Jonassen, David H.

    This paper describes a design model presented in workbook form which is intended to facilitate computer-assisted instruction (CAI) software design by teachers who do not have programming experience. Presentation of the model is preceded by a number of assumptions that underlie the instructional content and methods of the textbook. It is argued…

  2. Development of a zoning-based environmental-ecological-coupled model for lakes to assess lake restoration effect

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Zou, Changxin; Zhao, Yanwei

    2017-04-01

    Environmental/ecological models are widely used for lake management as they provide a means to understand the physical, chemical and biological processes in highly complex ecosystems. Most research has focused on the development of environmental (water quality) and ecological models separately. Few studies have coupled the two models, and in those limited coupled models, a lake was regarded as a whole for analysis (i.e., considering the lake to be one well-mixed box), which was appropriate for small-scale lakes but not sufficient to capture spatial variations within middle-scale or large-scale lakes. This paper seeks to establish a zoning-based environmental-ecological coupled model for a lake. Baiyangdian Lake, the largest freshwater lake in Northern China, was adopted as the study case. Coupled lake models, comprising a hydrodynamics and water quality model established in MIKE21 and a compartmental ecological model built with the STELLA software, were established for the middle-sized Baiyangdian Lake to simulate spatial variations of ecological conditions. On the basis of the flow field distribution results generated by the MIKE21 hydrodynamics model, four water area zones were used as an example for compartmental ecological model calibration and validation. The results revealed that the developed coupled lake models can reasonably reflect the changes of the key state variables, although some state variables are not well represented by the model due to the low quality of the field monitoring data. Monitoring sites in a compartment may not be representative of the water quality and ecological conditions in the entire compartment, even though that is the intention of compartment-based model design. There was only one ecological observation from a single monitoring site for some periods. This single-measurement issue may cause large discrepancies, particularly when the sampled site is not representative of the whole compartment. The

  3. Zones of impact around icebreakers affecting beluga whales in the Beaufort Sea.

    PubMed

    Erbe, C; Farmer, D M

    2000-09-01

    A software model estimating zones of impact on marine mammals around man-made noise [C. Erbe and D. M. Farmer, J. Acoust. Soc. Am. 108, 1327-1331 (2000)] is applied to the case of icebreakers affecting beluga whales in the Beaufort Sea. Two types of noise emitted by the Canadian Coast Guard icebreaker Henry Larsen are analyzed: bubbler system noise and propeller cavitation noise. Effects on beluga whales are modeled both in a deep-water environment and a near-shore environment. The model estimates that the Henry Larsen is audible to beluga whales over ranges of 35-78 km, depending on location. The zone of behavioral disturbance is only slightly smaller. Masking of beluga communication signals is predicted within 14-71-km range. Temporary hearing damage can occur if a beluga stays within 1-4 km of the Henry Larsen for at least 20 min. Bubbler noise impacts over the short ranges quoted; propeller cavitation noise accounts for all the long-range effects. Serious problems can arise in heavily industrialized areas where animals are exposed to ongoing noise and where anthropogenic noise from a variety of sources adds up.
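
    The zone estimates above follow from comparing a received sound level against a threshold. A back-of-envelope version (generic logarithmic spreading loss and invented levels, not the cited model's propagation and audiogram treatment) looks like this:

    ```python
    import numpy as np

    def received_level(source_level_db, r_m, spreading=15.0):
        """RL = SL - k*log10(r); k between 10 (cylindrical) and 20 (spherical)."""
        return source_level_db - spreading * np.log10(np.maximum(r_m, 1.0))

    source_level = 190.0   # dB re 1 uPa @ 1 m, plausible for propeller cavitation
    threshold = 120.0      # dB, combined hearing/background limit (assumed)

    r = np.logspace(0, 5.3, 1000)   # ranges from 1 m to ~200 km
    audible = r[received_level(source_level, r) > threshold]
    print(f"audible out to roughly {audible[-1] / 1000:.0f} km")
    ```

    With these invented numbers the audible range comes out near 46 km, the same order as the 35-78 km the published model reports, which is the sanity check the sketch is meant to convey.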

  4. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open-source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, iii) run and explore simulations in many ways: using the OpenFLUID software interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open-source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open-source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open-source license, with a special exception allowing existing models licensed under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network

  5. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    PubMed

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched with workflow symbols and custom symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, and privacy-enhancing techniques (PET), and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases and data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy-compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy.

  6. Radiobiological modeling with MarCell software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, J.S.; Jones, T.D.

    1999-01-01

    A nonlinear system of differential equations that models the bone marrow cellular kinetics associated with radiation injury, molecular repair, and compensatory cell proliferation has been extensively documented. Recently, that model has been implemented as MarCell, a user-friendly MS-DOS computer program that allows users with little knowledge of the original model to evaluate complex radiation exposure scenarios. The software allows modeling with the following radiations: tritium beta, 100 kVp X, 250 kVp X, 22 MV X, 60Co, 137Cs, 2 MeV electrons, TRIGA neutrons, D-T neutrons, and 3 blends of mixed-field fission radiations. The possible cell lineages are stem, stroma, and leukemia/lymphoma, and the available species include mouse, rat, dog, sheep, swine, burro, and man. An attractive mathematical feature is that any protracted protocol can be expressed as an equivalent prompt dose for either the source used or for a reference, such as 250 kVp X rays or 60Co. Output from MarCell includes: risk of 30-day mortality; risk of cancer and leukemia based either on cytopenia or compensatory cell proliferation; cell survival plots as a function of time or dose; and 4-week recovery kinetics following treatment. In this article, the program's applicability and ease of use are demonstrated by evaluating a medical total body irradiation protocol and a nuclear fallout scenario.
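
    The underlying structure, coupled ODEs for injury, repair and regrowth of a cell pool under a dose-rate forcing, can be sketched compactly. The equations and rate constants below are invented stand-ins chosen to show the shape of such a model, not MarCell's documented system.

    ```python
    from scipy.integrate import solve_ivp

    def dose_rate(t):
        """Gy/day: a 3-day exposure followed by recovery (invented protocol)."""
        return 2.0 if t < 3.0 else 0.0

    def marrow(t, y, kill_per_gy=0.8, repair=0.5, prolif=0.3):
        """Healthy and injured cell fractions: injury, repair, logistic regrowth."""
        healthy, injured = y
        hit = kill_per_gy * dose_rate(t) * healthy
        return [-hit + repair * injured + prolif * healthy * (1.0 - healthy),
                hit - repair * injured]

    sol = solve_ivp(marrow, (0.0, 28.0), [1.0, 0.0], max_step=0.1)
    print(f"surviving healthy fraction at day 28: {sol.y[0, -1]:.2f}")
    ```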

  7. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecularmore » systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.« less

  8. Fault compaction and overpressured faults: results from a 3-D model of a ductile fault zone

    NASA Astrophysics Data System (ADS)

    Fitzenz, D. D.; Miller, S. A.

    2003-10-01

    A model of a ductile fault zone is incorporated into a forward 3-D earthquake model to better constrain fault-zone hydraulics. The conceptual framework of the model fault zone was chosen such that two distinct parts are recognized. The fault core, characterized by a relatively low permeability, is composed of a coseismic fault surface embedded in a visco-elastic volume that can creep and compact. The fault core is surrounded by, and mostly sealed from, a high-permeability damaged zone. The model fault properties correspond explicitly to those of the coseismic fault core. Porosity and pore pressure evolve to account for the viscous compaction of the fault core, while stresses evolve in response to the applied tectonic loading and to shear creep of the fault itself. A small diffusive leakage is allowed in and out of the fault zone. Coseismically, porosity is created to account for frictional dilatancy. We show that, in the case of a 3-D fault model with no in-plane flow and constant fluid compressibility, pore pressures do not drop to hydrostatic levels after a seismic rupture, leading to an overpressured weak fault. Since pore pressure plays a key role in the fault behaviour, we investigate coseismic hydraulic property changes. In the full 3-D model, pore pressures vary instantaneously by the poroelastic effect during the propagation of the rupture. Once the stress state stabilizes, pore pressures are incrementally redistributed in the failed patch. We show that the significant effect of pressure-dependent fluid compressibility in the no in-plane flow case becomes a secondary effect when the other spatial dimensions are considered, because in-plane flow with a near-lithostatically pressured neighbourhood equilibrates at a pressure much higher than hydrostatic levels, forming persistent high-pressure fluid compartments. If the observed faults are not all overpressured and weak, other mechanisms, not included in this model, must be at work in nature, which need to be

  9. Discrete shear-transformation-zone plasticity modeling of notched bars

    NASA Astrophysics Data System (ADS)

    Kondori, Babak; Amine Benzerga, A.; Needleman, Alan

    2018-02-01

    Plane strain tension analyses of un-notched and notched bars are carried out using discrete shear transformation zone plasticity. In this framework, the carriers of plastic deformation are shear transformation zones (STZs) which are modeled as Eshelby inclusions. Superposition is used to represent a boundary value problem solution in terms of discretely modeled Eshelby inclusions, given analytically for an infinite elastic medium, and an image solution that enforces the prescribed boundary conditions. The image problem is a standard linear elastic boundary value problem that is solved by the finite element method. Potential STZ activation sites are randomly distributed in the bars and constitutive relations are specified for their evolution. Results are presented for un-notched bars, for bars with blunt notches and for bars with sharp notches. The computed stress-strain curves are serrated with the magnitude of the associated stress-drops depending on bar size, notch acuity and STZ evolution. Cooperative deformation bands (shear bands) emerge upon straining and, in some cases, high stress levels occur within the bands. Effects of specimen geometry and size on the stress-strain curves are explored. Depending on STZ kinetics, notch strengthening, notch insensitivity or notch weakening are obtained. The analyses provide a rationale for some conflicting findings regarding notch effects on the mechanical response of metallic glasses.

  10. Groundwater movement simulation by the software package PM5 for the Sviyaga river adjoining territory in the Republic of Tatarstan

    NASA Astrophysics Data System (ADS)

    Kosterina, E. A.; Isagadzhieva, Z. Sh

    2018-01-01

    Data from ecological-hydrogeological fieldwork in the Predvolzhye region of the Republic of Tatarstan were analyzed. A geofiltration model of the Buinsk region area near the village of Stary Studenets in the territory of the Republic of Tatarstan was constructed with the PM5 software package. The model can be developed into a basis for estimating the groundwater reserves of the territory, modeling the operation of water intake wells, designing the location of water intake wells and evaluating their operational capabilities, and constructing sanitary protection zones.

  11. Mechanical evolution of transpression zones affected by fault interactions: Insights from 3D elasto-plastic finite element models

    NASA Astrophysics Data System (ADS)

    Nabavi, Seyed Tohid; Alavi, Seyed Ahmad; Mohammadi, Soheil; Ghassemi, Mohammad Reza

    2018-01-01

    The mechanical evolution of transpression zones affected by fault interactions is investigated with a 3D elasto-plastic mechanical model solved by the finite-element method. Ductile transpression between non-rigid walls implies upward and lateral extrusion. The model results demonstrate that a transpression zone evolves in a 3D strain field along non-coaxial strain paths. Distributed plastic strain, slip transfer, and maximum plastic strain occur within the transpression zone. Outside the transpression zone, fault slip is reduced because deformation is accommodated by distributed plastic shear. With progressive deformation, the σ3 axis (the minimum compressive stress) rotates within the transpression zone to form an oblique angle to the regional transport direction (∼9°-10°). The magnitude of displacement increases faster within the transpression zone than outside it. Rotation of the displacement vectors of oblique convergence with time suggests that the transpression zone evolves toward an overall non-plane strain deformation. Slip decreases along fault segments and with increasing depth. This can be attributed to the accommodation of bulk shortening over adjacent fault segments. The model result shows an almost symmetrical domal uplift due to off-fault deformation, generating a doubly plunging fold and a 'positive flower' structure. Outside the overlap zone, expanding asymmetric basins subside into 'negative flower' structures on both sides of the transpression zone and are called 'transpressional basins'. Deflection at fault segments causes the fault dip to fall below 90° (∼86-89°) near the surface (∼1.5 km). This results in a pure-shear-dominated, triclinic, and discontinuously heterogeneous flow within the transpression zone.

  12. Predictive modelling of fault related fracturing in carbonate damage-zones: analytical and numerical models of field data (Central Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Mannino, Irene; Cianfarra, Paola; Salvini, Francesco

    2010-05-01

    Permeability in carbonates is strongly influenced by the presence of brittle deformation patterns, i.e. pressure-solution surfaces, extensional fractures, and faults. Carbonate rocks fracture both during diagenesis and during tectonic processes. The attitude, spatial distribution and connectivity of brittle deformation features rule the secondary permeability of carbonate rocks and therefore the accumulation and pathways of deep fluids (groundwater, hydrocarbons). This is particularly true in fault zones, where the damage zone and the fault core show hydraulic properties different from the pristine rock as well as from each other. To improve knowledge of fault architecture and fault hydraulic properties, we study the brittle deformation patterns related to fault kinematics in carbonate successions. In particular, we focussed on the evolution of damage-zone fracturing. Fieldwork was performed in Meso-Cenozoic carbonate units of the Latium-Abruzzi Platform, Central Apennines, Italy. These units represent field analogues of reservoir rocks in the Southern Apennines. We combine the study of the physical rock characteristics of 22 faults with quantitative analyses of brittle deformation for the same faults, including bedding attitudes, fracturing type, attitudes, and spatial intensity distribution, using the dimension/spacing ratio, namely the H/S ratio, where H is the dimension of the fracture and S is the spacing between two analogous fractures of the same set. Statistical analyses of structural data (stereonets, contouring and H/S transects) were performed to infer a focussed, general algorithm that describes the expected intensity of the fracturing process. The analytical model was fit to field measurements by a Monte Carlo convergence approach. This method proved a useful tool to quantify complex relations with a high number of variables. It creates a large sequence of possible solution parameters whose results are compared with field data. For each item an error mean value is
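
    The Monte Carlo convergence idea (generate many candidate parameter sets, score each against field measurements, keep the best) is straightforward to sketch. The model form and all numbers below are assumptions for illustration; the study's actual algorithm and data are richer.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def model(distance, a, decay):
        """Assumed form: H/S ratio decays exponentially away from the fault core."""
        return a * np.exp(-decay * distance)

    # hypothetical field transect: distance from fault core (m) vs. H/S ratio
    d_obs = np.array([1.0, 5.0, 10.0, 20.0, 40.0])
    hs_obs = np.array([2.9, 2.1, 1.4, 0.7, 0.2])

    best, best_err = None, np.inf
    for _ in range(100_000):                      # random search over parameters
        a, decay = rng.uniform(0.1, 5.0), rng.uniform(0.001, 0.5)
        err = np.mean((model(d_obs, a, decay) - hs_obs) ** 2)
        if err < best_err:
            best, best_err = (a, decay), err

    print(f"best fit a={best[0]:.2f}, decay={best[1]:.3f}, mse={best_err:.4f}")
    ```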

  13. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for the complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
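
    A relative complexity metric of this kind can be as simple as standardizing several raw module measurements and combining them into a single score used only for ranking. The module names, metrics and weights below are invented to show the mechanics:

    ```python
    import numpy as np

    modules = ["parser", "scheduler", "io", "ui"]
    raw = np.array([
        # lines_of_code, cyclomatic complexity, fan_out
        [1200, 45, 12],
        [800,  60,  7],
        [300,  10,  3],
        [500,  20,  9],
    ], dtype=float)

    z = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # put metrics on one scale
    weights = np.array([0.4, 0.4, 0.2])              # assumed relative importance
    relative_complexity = z @ weights

    for name, score in sorted(zip(modules, relative_complexity),
                              key=lambda p: -p[1]):
        print(f"{name:10s} {score:+.2f}")   # top scores flag maintenance-prone modules
    ```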

  14. Model-based software for simulating ultrasonic pulse/echo inspections of metal components

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.; Barnard, Daniel J.

    2017-02-01

    Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at Iowa State University, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (S/N) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray model for the response from an internal defect and the independent scatterer model for backscattered grain noise. This paper provides an overview of the ongoing modeling effort, with emphasis on recent developments. These include: treatment of angle-beam inspections; implementation of distance-amplitude corrections; changes in the generation of "invented" calibration signals; efforts to simulate ultrasonic C-scans; and experimental testing of model predictions. The simulation software can now treat both normal- and oblique-incidence immersion inspections of curved metal components having equiaxed microstructures in which the grain size varies with depth. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement-system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-squared grain noise amplitudes, and S/N as functions of the depth of the defect within the metal component. At any particular
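
    The headline output, S/N versus defect depth, is just the ratio of a modeled defect echo to the modeled grain-noise level, usually reported in decibels. A minimal sketch with invented amplitudes (a real run would obtain both from the Thompson-Gray and independent scatterer computations):

    ```python
    import numpy as np

    def snr_db(defect_peak, grain_noise_rms):
        """Signal-to-noise ratio in dB of a defect echo over grain noise."""
        return 20.0 * np.log10(defect_peak / grain_noise_rms)

    depths = np.array([10.0, 20.0, 30.0])      # defect depths, mm
    defect = np.array([0.80, 0.55, 0.35])      # modeled defect peak amplitudes
    noise_rms = np.array([0.05, 0.06, 0.07])   # modeled grain-noise RMS amplitudes

    for d, s in zip(depths, snr_db(defect, noise_rms)):
        print(f"depth {d:4.1f} mm: S/N = {s:4.1f} dB")
    ```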

  15. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    NASA Technical Reports Server (NTRS)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; hide

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Model data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather-model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
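
    A minimal sketch of the combination step, assuming a simple exponential decay of wet delay with elevation (the scale height H and the overall scheme are illustrative; this is not the JPL implementation):

      # Illustrative only: interpolate GPS zenith delays horizontally, then
      # rescale to the DEM surface with an assumed exponential scale height H.
      import numpy as np
      from scipy.interpolate import griddata

      def delay_map(sta_xy, sta_delay, sta_height, dem_xy, dem_height, H=2000.0):
          """sta_xy: (N, 2) station coordinates; delays and heights in meters."""
          # Reduce each station delay to sea level, interpolate, lift to the DEM.
          sea_level = np.asarray(sta_delay) * np.exp(np.asarray(sta_height) / H)
          interp = griddata(sta_xy, sea_level, dem_xy, method='linear')
          return interp * np.exp(-np.asarray(dem_height) / H)

    A production system would additionally blend in the weather-model PWV field, as described above, before rescaling to the terrain.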

  16. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    NASA Astrophysics Data System (ADS)

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the Python Lightcurve Identification and Microlensing Analysis package (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and test data generated using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
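
    The single-lens model such software fits is the standard point-source, point-lens (Paczynski) light curve; the snippet below is a generic illustration of that formula, not pyLIMA's actual API:

      # Point-source point-lens magnification A(t) for peak time t0, impact
      # parameter u0 (in Einstein radii), and Einstein crossing time tE.
      import numpy as np

      def pspl_magnification(t, t0, u0, tE):
          u = np.sqrt(u0**2 + ((np.asarray(t) - t0) / tE)**2)
          return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

      t = np.linspace(-30, 30, 7)   # days relative to the peak
      print(np.round(pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0), 2))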

  17. Sanitary protection zoning based on time-dependent vulnerability assessment model - case examples at two different type of aquifers

    NASA Astrophysics Data System (ADS)

    Živanović, Vladimir; Jemcov, Igor; Dragišić, Veselin; Atanacković, Nebojša

    2017-04-01

    Delineation of sanitary protection zones around a groundwater source is a comprehensive and multidisciplinary task. No uniform methodology for protection zoning across the various types of aquifers has been established. Currently applied methods mostly rely on the horizontal groundwater travel time toward the tapping structure. On the other hand, groundwater vulnerability assessment methods evaluate the protective function of the unsaturated zone as an important part of groundwater source protection. In some cases surface flow may also be important, because of the rapid transfer of contaminants toward zones with intense infiltration. For the delineation of sanitary protection zones, three major components should therefore be analysed: vertical travel time through the unsaturated zone, horizontal travel time through the saturated zone, and surface water travel time toward intense infiltration zones. Integrating these components into one time-dependent model forms the basis of the presented method for delineating groundwater source protection zones in rocks and sediments of different porosity. The proposed model comprises the travel-time components of surface water as well as groundwater (horizontal and vertical). The results obtained using the model represent the groundwater vulnerability as the sum of the surface water and groundwater travel times, and correspond to the travel time of potential contaminants from the ground surface to the tapping structure. This vulnerability assessment approach does not consider contaminant properties (intrinsic vulnerability), although it can easily be extended to evaluate specific groundwater vulnerability. The concept was applied at two different types of aquifers: the karstic aquifer of the catchment area of the Blederija springs and the "Beli Timok" source in a shallow intergranular aquifer. The first represents a typical karst hydrogeological system with part of the catchment receiving allogenic recharge, and the second one
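
    A minimal sketch of the time-summation idea, with illustrative zone thresholds (the 50/200/3650-day values are placeholders, not the authors'):

      # Total travel time per cell = surface + vertical (unsaturated) +
      # horizontal (saturated) components, binned into protection zones.
      import numpy as np

      def protection_zones(t_surface, t_vertical, t_horizontal,
                           thresholds_days=(50, 200, 3650)):
          """Return (zone index per cell, 1 = strictest; total time in days)."""
          total = (np.asarray(t_surface) + np.asarray(t_vertical)
                   + np.asarray(t_horizontal))
          return np.digitize(total, thresholds_days) + 1, total

      zones, total = protection_zones([2, 10, 0], [30, 400, 2000], [100, 900, 5000])
      print(zones, total)   # [2 3 4] [ 132 1310 7000]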

  18. Modeling Complex Cross-Systems Software Interfaces Using SysML

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin

    2013-01-01

    The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the others. To determine if a model-based systems engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the Systems Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).

  19. Phase equilibria constraints on models of subduction zone magmatism

    NASA Astrophysics Data System (ADS)

    Myers, James D.; Johnston, Dana A.

    Petrologic models of subduction zone magmatism can be grouped into three broad classes: (1) predominantly slab-derived, (2) mainly mantle-derived, and (3) multi-source. Slab-derived models assume high-alumina basalt (HAB) approximates primary magma and is derived by partial fusion of the subducting slab. Such melts must, therefore, be saturated with some combination of eclogite phases, e.g. cpx, garnet, qtz, at the pressures, temperatures and water contents of magma generation. In contrast, mantle-dominated models suggest partial melting of the mantle wedge produces primary high-magnesia basalts (HMB) which fractionate to yield derivative HAB magmas. In this context, HMB melts should be saturated with a combination of peridotite phases, i.e. ol, cpx and opx, and have liquid-lines-of-descent that produce high-alumina basalts. HAB generated in this manner must be saturated with a mafic phase assemblage at the intensive conditions of fractionation. Multi-source models combine slab and mantle components in varying proportions to generate the four main lava types (HMB, HAB, high-magnesia andesites (HMA) and evolved lavas) characteristic of subduction zones. The mechanism of mass transfer from slab to wedge as well as the nature and fate of primary magmas vary considerably among these models. Because of their complexity, these models imply a wide range of phase equilibria. Although the experiments conducted on calc-alkaline lavas are limited, they place the following limitations on arc petrologic models: (1) HAB cannot be derived from HMB by crystal fractionation at the intensive conditions thus far investigated, (2) HAB could be produced by anhydrous partial fusion of eclogite at high pressure, (3) HMB liquids can be produced by peridotite partial fusion 50-60 km above the slab-mantle interface, (4) HMA cannot be primary magmas derived by partial melting of the subducted slab, but could have formed by slab melt-peridotite interaction, and (5) many evolved calc

  20. Structure and Deformation in the Transpressive Zone of Southern California Inferred from Seismicity, Velocity, and Qp Models

    NASA Astrophysics Data System (ADS)

    Hauksson, E.; Shearer, P.

    2004-12-01

    We synthesize relocated regional seismicity and 3D velocity and Qp models to infer structure and deformation in the transpressive zone of southern California. These models provide a comprehensive synthesis of the tectonic fabric of the upper to middle crust, and of the brittle-ductile transition zone that in some cases extends into the lower crust. The regional seismicity patterns in southern California are brought into focus when the hypocenters are relocated using the double-difference method. In detail, the spatial correlation between background seismicity and late Quaternary faults often improves as the hypocenters become more clustered and the spatial patterns become more sharply defined. Along some of the strike-slip faults the seismicity clusters decrease in width and form alignments, implying that in many cases the clusters are associated with a single fault. In contrast, the Los Angeles Basin seismicity remains mostly scattered, reflecting a 3D distribution of the tectonic compression. We present the results of relocating 327,000 southern California earthquakes that occurred between 1984 and 2002. In particular, the depth distribution is improved and less affected by layer boundaries in velocity models or other similar artifacts, and thus improves the definition of the brittle-ductile transition zone. The 3D VP and VP/VS models confirm existing tectonic interpretations and provide new insights into the configuration of the geological structures in southern California. The models extend from the US-Mexico border in the south to the Coast Ranges and Sierra Nevada in the north, and have 15 km horizontal grid spacing and an average vertical grid spacing of 4 km, down to 22 km depth. The heterogeneity of the crustal structure as imaged in both the VP and VP/VS models is larger within the Pacific than the North America plate, reflecting regional asymmetric variations in the crustal composition and past tectonic processes. Similarly, the relocated seismicity is

  1. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been substantially improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  2. Software Tools For Building Decision-support Models For Flood Emergency Situations

    NASA Astrophysics Data System (ADS)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  3. Radiofrequency ablation of liver metastases-software-assisted evaluation of the ablation zone in MDCT: tumor-free follow-up versus local recurrent disease.

    PubMed

    Keil, Sebastian; Bruners, Philipp; Schiffl, Katharina; Sedlmair, Martin; Mühlenbruch, Georg; Günther, Rolf W; Das, Marco; Mahnken, Andreas H

    2010-04-01

    The purpose of this study was to investigate differences in change of size and CT value between local recurrences and tumor-free areas after CT-guided radiofrequency ablation (RFA) of hepatic metastases during follow-up by means of dedicated software for automatic evaluation of hepatic lesions. Thirty-two patients with 54 liver metastases from breast or colorectal cancer underwent triphasic contrast-enhanced multidetector-row computed tomography (MDCT) to evaluate hepatic metastatic spread and localization before CT-guided RFA and for follow-up after intervention. Sixteen of these patients (65.1 ± 10.3 years) with 30 metastases stayed tumor-free (group 1), while the other group (n = 16 with 24 metastases; 62.0 ± 13.8 years) suffered from local recurrent disease (group 2). Applying an automated software tool (SyngoCT Oncology; Siemens Healthcare, Forchheim, Germany), size parameters (volume, RECIST, WHO) and attenuation were measured within the lesions before, 1 day after, and 28 days after RFA treatment. The natural logarithm (ln) of the quotient of the volume 1 day versus 28 days after RFA treatment was computed: lnQ1/28(volume). Analogously, ln ratios of RECIST, WHO, and attenuation were computed and statistically evaluated by repeated-measures ANOVA. One lesion in group 2 was excluded from further evaluation due to automated missegmentation. Statistically significant differences between the two groups were observed with respect to initial volume, RECIST, and WHO (p < 0.05). Furthermore, ln ratios corresponding to volume, RECIST, and WHO differed significantly between the two groups. Attenuation evaluations showed no significant differences, but there was a trend toward attenuation assessment for the parameter lnQ28/0(attenuation) (p = 0.0527), showing higher values for group 1 (-0.4 ± 0.3) compared to group 2 (-0.2 ± 0.2). In conclusion, hepatic metastases and their zone of coagulation necrosis after RFA differed significantly between tumor
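
    For clarity, the ln-ratio metrics above are simply natural logarithms of quotients of two measurements; a toy computation with invented volumes:

      # Toy illustration of the size-change metric: ln of the volume ratio
      # between two follow-up time points (values are invented).
      import math

      v_day1, v_day28 = 12.5, 9.8         # lesion volume in mL (hypothetical)
      ln_q = math.log(v_day1 / v_day28)   # > 0 means the zone shrank by day 28
      print(round(ln_q, 3))               # 0.243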

  4. Development of an irrigation scheduling software based on model predicted crop water stress

    USDA-ARS?s Scientific Manuscript database

    Modern irrigation scheduling methods are generally based on sensor-monitored soil moisture regimes rather than crop water stress, which is difficult to measure in real time but can be computed using agricultural system models. In this study, an irrigation scheduling software based on RZWQM2 model pr...

  5. A kinematic model for the evolution of the Eastern California Shear Zone and Garlock Fault, Mojave Desert, California

    NASA Astrophysics Data System (ADS)

    Dixon, Timothy H.; Xie, Surui

    2018-07-01

    The Eastern California shear zone in the Mojave Desert, California, accommodates nearly a quarter of Pacific-North America plate motion. In south-central Mojave, the shear zone consists of six active faults, with the central Calico fault having the fastest slip rate. However, faults to the east of the Calico fault have larger total offsets. We explain this pattern of slip rate and total offset with a model involving a crustal block (the Mojave Block) that migrates eastward relative to a shear zone at depth whose position and orientation are fixed by the Coachella segment of the San Andreas fault (SAF), southwest of the transpressive "big bend" in the SAF. Both the shear zone and the Garlock fault are assumed to be a direct result of this restraining bend and consequent strain redistribution. The model explains several aspects of local and regional tectonics, may apply to other transpressive continental plate boundary zones, and may improve seismic hazard estimates in these zones.

  6. Boundary conditions traps when modeling interseismic deformation at subduction zones

    NASA Astrophysics Data System (ADS)

    Contreras, Marcelo; Gerbault, Muriel; Tassara, Andres; Bataille, Klaus; Araya, Rodolfo

    2017-04-01

    In order to gain insight into the controlling factors for elastic strain build-up in subduction zones, such as those triggering the Mw 8.8 2010 Maule earthquake, we published a modeling study to test the influence of the subducting plate thickness, variations in the updip and downdip limits of a 100% locked interplate zone, elastic parameters, and velocity reduction at the base of the subducted slab (Contreras et al., Andean Geology 43(3), 2016). When comparing our modeled predictions with interseismic GPS observations, our results indicated little influence of the subducting plate thickness, but a need to reduce the velocity at the corner-base of the subducted slab below the trench region to 10% of the far-field convergence rate. Complementary numerical models allowed us to link this velocity reduction at the base of the subducting slab to a long-term high flexural stress resulting from the mechanical interaction of the slab with the underlying mantle. We argue that even if only a small fraction of these high deviatoric stresses transfers energy towards the upper portion of the slab, it may participate in triggering large earthquakes such as the Mw 8.8 Maule event. The definition of initial and boundary conditions when passing from short-term to long-term models exposes the mechanical inconsistencies that may appear when considering pre-flexed subducting slabs and an unloaded underlying asthenosphere, potentially creating mis-balanced large stress discontinuities.

  7. [Establishment of a 3D finite element model of human skull using MSCT images and Mimics software].

    PubMed

    Huang, Ping; Li, Zheng-dong; Shao, Yu; Zou, Dong-hua; Liu, Ning-guo; Li, Li; Chen, Yuan-yuan; Wan, Lei; Chen, Yi-jiu

    2011-02-01

    To establish a human 3D finite element skull model and to explore its value in biomechanics analysis. The cadaveric head was scanned and a 3D skull model was then created using Mimics software based on 2D axial CT images. The 3D skull model was optimized by a preprocessor along with creation of the surface and volume meshes. The stress changes after the head was struck by an object or hit the ground directly were analyzed using ANSYS software. The original 3D skull model contained a large number of poor-quality triangles yet showed high similarity to the real head, while the optimized model showed high-quality surface and volume meshes with comparatively few triangles. The model could show the local and global stress changes effectively. The human 3D skull model can be established using MSCT and Mimics software and provides a good finite element model for biomechanics analysis. This model may also provide a basis for the study of head stress changes following different forces.

  8. Development and testing of a compartmentalized reaction network model for redox zones in contaminated aquifers

    USGS Publications Warehouse

    Abrams, Robert H.; Loague, Keith; Kent, Douglas B.

    1998-01-01

    The work reported here is the first part of a larger effort focused on efficient numerical simulation of redox zone development in contaminated aquifers. The sequential use of various electron acceptors, which is governed by the energy yield of each reaction, gives rise to redox zones. The large difference in energy yields between the various redox reactions leads to systems of equations that are extremely ill-conditioned. These equations are very difficult to solve, especially in the context of coupled fluid flow, solute transport, and geochemical simulations. We have developed a general, rational method to solve such systems where we focus on the dominant reactions, compartmentalizing them in a manner that is analogous to the redox zones that are often observed in the field. The compartmentalized approach allows us to easily solve a complex geochemical system as a function of time and energy yield, laying the foundation for our ongoing work in which we couple the reaction network, for the development of redox zones, to a model of subsurface fluid flow and solute transport. Our method (1) solves the numerical system without invoking a redox parameter, (2) improves the numerical stability of redox systems by choosing which compartment and thus which reaction network to use based upon the concentration ratios of key constituents, (3) simulates the development of redox zones as a function of time without the use of inhibition factors or switching functions, and (4) can reduce the number of transport equations that need to be solved in space and time. We show through the use of various model performance evaluation statistics that the appropriate compartment choice under different geochemical conditions leads to numerical solutions without significant error. The compartmentalized approach described here facilitates the next phase of this effort where we couple the redox zone reaction network to models of fluid flow and solute transport.
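
    A hedged illustration of the compartment-selection idea, choosing the active terminal electron acceptor from concentration ratios (the acceptor sequence and the initial concentrations are assumed for the example, not taken from the paper):

      # Pick the active compartment from ratios of key constituents, mimicking
      # the sequential use of O2, NO3-, Fe(III), SO4-- and finally CO2.
      def active_compartment(conc, threshold_ratio=0.01):
          """conc: dict of mol/L; returns the first acceptor whose concentration
          still exceeds threshold_ratio times its assumed initial value."""
          initial = {'O2': 3e-4, 'NO3': 5e-4, 'FeIII': 1e-3, 'SO4': 2e-3}
          for acceptor in ('O2', 'NO3', 'FeIII', 'SO4'):
              if conc.get(acceptor, 0.0) > threshold_ratio * initial[acceptor]:
                  return acceptor
          return 'CO2'  # methanogenic conditions once all others are exhausted

      print(active_compartment({'O2': 1e-8, 'NO3': 4e-4, 'FeIII': 1e-3}))
      # -> 'NO3' (denitrifying compartment)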

  9. Numerical modeling of fluid flow in a fault zone: a case of study from Majella Mountain (Italy).

    NASA Astrophysics Data System (ADS)

    Romano, Valentina; Battaglia, Maurizio; Bigi, Sabina; De'Haven Hyman, Jeffrey; Valocchi, Albert J.

    2017-04-01

    The study of fluid flow in fractured rocks plays a key role in reservoir management, including CO2 sequestration and waste isolation. We present a numerical model of fluid flow in a fault zone, based on field data acquired in Majella Mountain, in the Central Apennines (Italy). This fault zone is considered a good analogue for the massive presence of fluid migration in the form of tar. Faults are mechanical features that cause permeability heterogeneities in the upper crust, so they strongly influence fluid flow. The distribution of the main components (core, damage zone) can lead the fault zone to act as a conduit, a barrier, or a combined conduit-barrier system. We integrated existing information and our own structural surveys of the area to better identify the major fault features (e.g., type of fractures, statistical properties, geometrical and petro-physical characteristics). In our model the damage zones of the fault are described as a discretely fractured medium, while the core of the fault is treated as a porous one. Our model utilizes the dfnWorks code, a parallelized computational suite developed at Los Alamos National Laboratory (LANL), which generates three-dimensional Discrete Fracture Networks (DFN) of the damage zones of the fault and characterizes their hydraulic parameters. The challenge of the study is the coupling between the discrete domain of the damage zones and the continuum domain of the core. The field investigations and the basic computational workflow will be described, along with preliminary results of fluid flow simulation at the scale of the fault.

  10. COSP: Satellite simulation software for model assessment

    DOE PAGES

    Bodas-Salcedo, A.; Webb, M. J.; Bony, S.; ...

    2011-08-01

    Errors in the simulation of clouds in general circulation models (GCMs) remain a long-standing issue in climate projections, as discussed in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report. This highlights the need for developing new analysis techniques to improve our knowledge of the physical processes at the root of these errors. The Cloud Feedback Model Intercomparison Project (CFMIP) pursues this objective, and under that framework the CFMIP Observation Simulator Package (COSP) has been developed. COSP is a flexible software tool that enables the simulation of several satellite-borne active and passive sensor observations from model variables. The flexibility of COSP and a common interface for all sensors facilitates its use in any type of numerical model, from high-resolution cloud-resolving models to the coarser-resolution GCMs assessed by the IPCC, and the scales in between used in weather forecast and regional models. The diversity of model parameterization techniques makes the comparison between model and observations difficult, as some parameterized variables (e.g., cloud fraction) do not have the same meaning in all models. The approach followed in COSP permits models to be evaluated against observations and compared against each other in a more consistent manner. This thus permits a more detailed diagnosis of the physical processes that govern the behavior of clouds and precipitation in numerical models. The World Climate Research Programme (WCRP) Working Group on Coupled Modelling has recommended the use of COSP in a subset of climate experiments that will be assessed by the next IPCC report. Here we describe COSP, present some results from its application to numerical models, and discuss future work that will expand its capabilities.

  11. Coastal zone environment measurements at Sakhalin Island using autonomous mobile robotic system

    NASA Astrophysics Data System (ADS)

    Tyugin, Dmitry; Kurkin, Andrey; Zaytsev, Andrey; Zeziulin, Denis; Makarov, Vladimir

    2017-04-01

    To perform continuous, complex measurements of environmental characteristics in coastal zones, an autonomous mobile robotic system (AMRS) was built. The main advantage of such a system over manual measurements is the ability to quickly relocate the equipment and start measurements. The AMRS can transport a set of sensors and an appropriate power source over long distances. The equipment installed on the AMRS includes: a modern high-tech ship's radar «Micran» for sea wave measurements, a multiparameter platform WXT 520 for weather monitoring, a high-precision GPS/GLONASS receiver OS-203 for georeferencing, a laser scanner platform based on two Sick LMS-511 scanners, which can provide 3D distance measurements up to 80 meters along the AMRS route, and a ruggedized quad-core fanless computer Matrix MXE-5400 for data collection and recording. The equipment is controlled by high-performance modular software developed specifically for the AMRS. An experiment was conducted during the summer of 2016, with measurements taking place in the coastal zone of Sakhalin Island (Russia). The measuring system of the AMRS was started in automatic mode controlled by the software. As a result, a large amount of data was collected and processed into a database, consisting of continuous measurements of the coastal zone under different weather conditions. Of most interest for investigation is a period of a three-point storm detected on June 2, 2016. Further work will focus on processing the measured environmental characteristics and on verifying numerical models against the collected data. The presented results were obtained with the support of the Russian president's scholarship for young scientists and graduate students №SP-193.2015.5.

  12. Cognon Neural Model Software Verification and Hardware Implementation Design

    NASA Astrophysics Data System (ADS)

    Haro Negre, Pau

    Little is known yet about how the brain can recognize arbitrary sensory patterns within milliseconds using neural spikes to communicate information between neurons. In a typical brain there are several layers of neurons, with each neuron axon connecting to ~10^4 synapses of neurons in an adjacent layer. The information necessary for cognition is contained in these synapses, which strengthen during the learning phase in response to newly presented spike patterns. Building on the model proposed in "Models for Neural Spike Computation and Cognition" by David H. Staelin and Carl H. Staelin, this study seeks to understand cognition from an information-theoretic perspective and develop potential models for artificial implementation of cognition based on neuronal models. To do so we focus on the mathematical properties and limitations of spike-based cognition consistent with existing neurological observations. We validate the cognon model through software simulation and develop concepts for an optical hardware implementation of a network of artificial neural cognons.

  13. The single-zone numerical model of homogeneous charge compression ignition engine performance

    NASA Astrophysics Data System (ADS)

    Fedyanov, E. A.; Itkis, E. M.; Kuzmin, V. N.; Shumskiy, S. N.

    2017-02-01

    A single-zone model of methane-air mixture combustion in a Homogeneous Charge Compression Ignition (HCCI) engine was developed. Initial modeling efforts resulted in the selection of the detailed kinetic reaction mechanism most appropriate for the conditions of the HCCI process. The model was then extended to simulate the performance of a four-stroke engine and supplemented with physically reasonable adjusting functions. Validation of the calculations against experimental data showed acceptable agreement.
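
    As a simplified constant-volume analogue of such a single-zone calculation, the sketch below uses the open-source Cantera library with the GRI-Mech 3.0 methane mechanism; the published model additionally treats the four-stroke cycle and the adjusting functions, which are omitted here, and the initial state is assumed:

      # Single-zone, constant-volume autoignition of a methane-air mixture
      # with detailed kinetics (GRI-Mech 3.0) via Cantera.
      import cantera as ct

      gas = ct.Solution('gri30.yaml')
      gas.TPX = 900.0, 20.0 * ct.one_atm, 'CH4:1, O2:2, N2:7.52'
      reactor = ct.IdealGasReactor(gas)
      sim = ct.ReactorNet([reactor])

      t = 0.0
      while t < 0.1:                       # integrate up to 100 ms of chemistry
          t = sim.step()
          if reactor.T > 1500.0:           # crude ignition criterion
              print(f'ignition near t = {t * 1e3:.2f} ms, T = {reactor.T:.0f} K')
              break
      else:
          print(f'no ignition within 100 ms; final T = {reactor.T:.0f} K')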

  14. A UML-based metamodel for software evolution process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing

    2014-04-01

    A software evolution process is a set of interrelated software processes under which the corresponding software evolves. An object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints on the meta-model are presented in this paper. OO-EPMM can represent not only the software development process but also software evolution.

  15. Automated UHPLC separation of 10 pharmaceutical compounds using software-modeling.

    PubMed

    Zöldhegyi, A; Rieger, H-J; Molnár, I; Fekhretdinova, L

    2018-03-20

    Human mistakes are still one of the main underlying causes of regulatory findings and, in compliance with FDA's Data Integrity and Analytical Quality by Design (AQbD) requirements, must be eliminated. To develop smooth, fast and robust methods that are free of human failures, state-of-the-art automation is presented. For the scope of this study, commercial software (DryLab) and a model mixture of 10 drugs were subjected to testing. Following AQbD principles, the best available working point was selected, and confirmatory experimental runs, i.e. the six worst cases from the robustness calculation, were performed. Simulated results were found to be in excellent agreement with the experimental ones, proving the usefulness and effectiveness of automated, software-assisted analytical method development.

  16. Modeling Water Flux at the Base of the Rooting Zone for Soils with Varying Glacial Parent Materials

    NASA Astrophysics Data System (ADS)

    Naylor, S.; Ellett, K. M.; Ficklin, D. L.; Olyphant, G. A.

    2013-12-01

    Soils of varying glacial parent materials in the Great Lakes Region (USA) are characterized by thin unsaturated zones and widespread use of agricultural pesticides and nutrients that affect shallow groundwater. To better our understanding of the fate and transport of contaminants, improved models of water fluxes through the vadose zones of various hydrogeologic settings are warranted. Furthermore, calibrated unsaturated zone models can be coupled with watershed models, providing a means for predicting the impact of varying climate scenarios on agriculture in the region. To address these issues, a network of monitoring sites was developed in Indiana that provides continuous measurements of precipitation, potential evapotranspiration (PET), soil volumetric water content (VWC), and soil matric potential to parameterize and calibrate models. Flux at the base of the root zone is simulated using two models of varying complexity: 1) the HYDRUS model, which numerically solves the Richards equation, and 2) the soil-water-balance (SWB) model, which assumes vertical flow under a unit gradient with infiltration and evapotranspiration treated as separate, sequential processes. Soil hydraulic parameters are determined based on laboratory data, a pedo-transfer function (ROSETTA), field measurements (Guelph permeameter), and parameter optimization. Groundwater elevation data are available at three of six sites to establish the base of the unsaturated zone model domain. Initial modeling focused on the groundwater recharge season (Nov-Feb) when PET is limited and much of the annual vertical flux occurs. HYDRUS results indicate that base of root zone fluxes at a site underlain by glacial ice-contact parent materials are 48% of recharge season precipitation (VWC RMSE=8.2%), while SWB results indicate that fluxes are 43% (VWC RMSE=3.7%). Due in part to variations in surface boundary conditions, more variable fluxes were obtained for a site underlain by alluvium with the SWB model (68
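
    A minimal sketch of an SWB-style bucket under the unit-gradient assumption, with infiltration and evapotranspiration applied sequentially (the field-capacity and wilting-point values are invented):

      # Daily soil-water balance: water above field capacity drains below
      # the root zone; ET is limited by plant-available storage.
      def swb_daily(precip, pet, storage, field_capacity=300.0, wilting=100.0):
          """All quantities in mm; returns (new storage, drainage below roots)."""
          storage += precip                           # infiltration step
          et = min(pet, max(storage - wilting, 0.0))  # ET step, water-limited
          storage -= et
          drainage = max(storage - field_capacity, 0.0)
          return storage - drainage, drainage

      storage, flux = 250.0, 0.0
      for p, e in [(20, 1), (0, 1), (35, 0.5), (10, 1)]:  # toy Nov-Feb days
          storage, d = swb_daily(p, e, storage)
          flux += d
      print(round(storage, 1), round(flux, 1))            # 300.0 11.5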

  17. Thermomechanical modeling of the Colorado Plateau-Basin and range transition zone

    NASA Technical Reports Server (NTRS)

    Londe, M. D.

    1985-01-01

    The Colorado Plateau (CP)-Basin and Range (B & R) boundary is marked by a transition zone on the order of 75 to 150 km in width. As one moves westward across this transition from the CP interior to the B & R, there is a variation in the surface topography, surface heat flow, Bouguer gravity, seismicity, and crustal structure. This transition extends eastward into the western CP from the Wasatch-Hurricane fault line and is largely coincident with the high plateaus of Utah and the Wasatch Mountains. It has been suggested that this transition zone marks a thermal and tectonic encroachment of the CP by the B & R. A simple two-dimensional numerical model of the thermal regime for the transition zone was constructed to test the hypothesis that the observed geophysical signatures across the transition are due to lateral heat conduction from steady-state uniform extension within the B & R lithosphere. Surface heat flow, uplift due to flexure from thermal buoyant loading, and regional Bouguer gravity are computed for various extension rates, crustal structures, and compensation depths.

  18. An Open Software Platform for Sharing Water Resource Models, Code and Data

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
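
    A hedged sketch of the node-link abstraction and an App-style exporter (the field names are illustrative, not Hydra's actual schema):

      # A network as nodes and links, plus a tiny "App" that exports the link
      # table to CSV so an external model can consume it.
      import csv

      network = {
          'name': 'demo-basin',
          'nodes': [{'id': 1, 'name': 'reservoir', 'x': 0.0, 'y': 0.0},
                    {'id': 2, 'name': 'city', 'x': 10.0, 'y': 5.0}],
          'links': [{'id': 1, 'name': 'aqueduct', 'from': 1, 'to': 2}],
      }

      def export_links_csv(net, path):
          with open(path, 'w', newline='') as f:
              writer = csv.DictWriter(f, fieldnames=['id', 'name', 'from', 'to'])
              writer.writeheader()
              writer.writerows(net['links'])

      export_links_csv(network, 'links.csv')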

  19. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  20. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications.
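
    The sketch below illustrates the concept of a uniform query layer over alternative models (not UCVM's actual API): every registered model answers the same (lon, lat, depth) query with Vp, Vs, and density.

      # Uniform (lon, lat, depth) -> (Vp, Vs, density) queries over a registry
      # of interchangeable velocity models; the 1D model here is a toy.
      from typing import Callable, Dict, Tuple

      MaterialProps = Tuple[float, float, float]   # Vp (m/s), Vs (m/s), kg/m^3

      def hardrock_1d(lon: float, lat: float, depth: float) -> MaterialProps:
          vp = 5000.0 + 0.5 * depth                # properties grow with depth
          return vp, vp / 1.73, 2600.0 + 0.01 * depth

      REGISTRY: Dict[str, Callable[[float, float, float], MaterialProps]] = {
          '1d': hardrock_1d,
      }

      def query(model: str, lon: float, lat: float, depth: float) -> MaterialProps:
          return REGISTRY[model](lon, lat, depth)

      print(query('1d', -118.2, 34.05, 1000.0))    # (5500.0, ~3179.2, 2610.0)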

  1. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  2. EPA's Benchmark Dose Modeling Software

    EPA Science Inventory

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  3. En-face imaging of the ellipsoid zone in the retina from optical coherence tomography B-scans

    NASA Astrophysics Data System (ADS)

    Holmes, T.; Larkin, S.; Downing, M.; Csaky, K.

    2015-03-01

    It is generally believed that photoreceptor integrity is related to the ellipsoid zone appearance in optical coherence tomography (OCT) B-scans. Algorithms and software were developed for viewing and analyzing the ellipsoid zone. The software performs the following: (a) automated ellipsoid zone isolation in the B-scans; (b) en-face viewing of the ellipsoid-zone reflectance; (c) alignment and overlay of (b) onto reflectance images of the retina; and (d) alignment and overlay of (c) with microperimetry sensitivity points. Dataset groups from normal and dry age-related macular degeneration (DAMD) subjects were compared. Scalar measurements for correlation against condition included the mean and standard deviation of the ellipsoid zone's reflectance. The image-processing techniques for automatically finding the ellipsoid zone are based upon a calculation of optical flow which tracks the edges of laminated structures across an image. Statistical significance was shown in t-tests of these measurements with the population pools separated into normal and DAMD subjects. A display of en-face ellipsoid-zone reflectance shows a clear and recognizable difference between the normal and DAMD subjects, in that they show generally uniform and nonuniform reflectance, respectively, over the region near the macula. Regions surrounding points of low microperimetry (μP) sensitivity have irregular and lower levels of ellipsoid-zone reflectance nearby. These findings support the idea that photoreceptor integrity could be affecting both the ellipsoid-zone reflectance and the sensitivity measurements.

  4. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  5. Association of parameter, software, and hardware variation with large-scale behavior across 57,000 climate models

    PubMed Central

    Knight, Christopher G.; Knight, Sylvia H. E.; Massey, Neil; Aina, Tolu; Christensen, Carl; Frame, Dave J.; Kettleborough, Jamie A.; Martin, Andrew; Pascoe, Stephen; Sanderson, Ben; Stainforth, David A.; Allen, Myles R.

    2007-01-01

    In complex spatial models, as used to predict the climate response to greenhouse gas emissions, parameter variation within plausible bounds has major effects on model behavior of interest. Here, we present an unprecedentedly large ensemble of >57,000 climate model runs in which 10 parameters, initial conditions, hardware, and software used to run the model all have been varied. We relate information about the model runs to large-scale model behavior (equilibrium sensitivity of global mean temperature to a doubling of carbon dioxide). We demonstrate that effects of parameter, hardware, and software variation are detectable, complex, and interacting. However, we find most of the effects of parameter variation are caused by a small subset of parameters. Notably, the entrainment coefficient in clouds is associated with 30% of the variation seen in climate sensitivity, although both low and high values can give high climate sensitivity. We demonstrate that the effect of hardware and software is small relative to the effect of parameter variation and, over the wide range of systems tested, may be treated as equivalent to that caused by changes in initial conditions. We discuss the significance of these results in relation to the design and interpretation of climate modeling experiments and large-scale modeling more generally. PMID:17640921

  6. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    NASA Astrophysics Data System (ADS)

    de Rigo, Daniele

    2013-04-01

    Computational aspects increasingly shape environmental sciences [1]. Actually, transdisciplinary modelling of complex and uncertain environmental systems is challenging computational science (CS) and also the science-policy interface [2-7]. Large spatial-scale problems falling within this category - i.e. wide-scale transdisciplinary modelling for environment (WSTMe) [8-10] - often deal with factors (a) for which deep-uncertainty [2,11-13] may prevent usual statistical analysis of modelled quantities and need different ways for providing policy-making with science-based support. Here, practical recommendations are proposed for tempering a peculiar - not infrequently underestimated - source of uncertainty. Software errors in complex WSTMe may subtly affect the outcomes with possible consequences even on collective environmental decision-making. Semantic transparency in CS [2,8,10,14,15] and free software [16,17] are discussed as possible mitigations (b). Software uncertainty, black-boxes and free software. Integrated natural resources modelling and management (INRMM) [29] frequently exploits chains of nontrivial data-transformation models (D-TM), each of them affected by uncertainties and errors. Those D-TM chains may be packaged as monolithic specialized models, maybe only accessible as black-box executables (if accessible at all) [50]. For end-users, black-boxes merely transform inputs into the final outputs, relying on classical peer-reviewed publications for describing the internal mechanism. While software tautologically plays a vital role in CS, it is often neglected in favour of more theoretical aspects. This paradox has been provocatively described as "the invisibility of software in published science. Almost all published papers required some coding, but almost none mention software, let alone include or link to source code" [51]. Recently, this primacy of theory over reality [52-54] has been challenged by new emerging hybrid approaches [55] and by the

  7. Modeling of ultrasonic processes utilizing a generic software framework

    NASA Astrophysics Data System (ADS)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcoming this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled via slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic-assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece's material characteristics are presented. For both applications input and output variables are defined to meet the requirements of the framework's interface.
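
    A minimal sketch of the master/slave coupling idea, iterating toy partial models to a fixed point (the interfaces and the two models are invented for illustration):

      # Slave modules expose a common solve() interface; the master iterates
      # the coupled models until the exchanged variables settle.
      from abc import ABC, abstractmethod

      class SlaveModule(ABC):
          @abstractmethod
          def solve(self, state: dict) -> dict:
              """Advance this partial model and return its output variables."""

      class Oscillator(SlaveModule):
          def solve(self, state):
              # toy: tip velocity drops as the process load rises
              return {'velocity': 1.0 / (1.0 + state.get('load', 0.0))}

      class ProcessLoad(SlaveModule):
          def solve(self, state):
              # toy: load grows with tip velocity
              return {'load': 0.5 * state.get('velocity', 0.0)}

      def master(modules, iterations=20):
          state = {}
          for _ in range(iterations):      # fixed-point coupling iteration
              for m in modules:
                  state.update(m.solve(state))
          return state

      print(master([Oscillator(), ProcessLoad()]))  # v ~ 0.73, load ~ 0.37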

  8. An Extended Multi-Zone Model for the MCG-6-30-15 Warm Absorber

    NASA Technical Reports Server (NTRS)

    Morales, R.; Fabian, A. C.; Reynolds, C. S.

    2000-01-01

    The variable warm absorber seen with ASCA in the X-ray spectrum of MCG-6-30-15 shows complex time behaviour in which the optical depth of O VIII anticorrelates with the flux whereas that of O VII is unchanging. The explanation in terms of a two-zone absorber has since been challenged by BeppoSAX observations, which reveal a more complicated behaviour for the O VII edge. The explanation we offer for both the ASCA and BeppoSAX observations requires a very simple photoionization model together with the presence of a third, intermediate zone and a period of very low luminosity. In practice, warm absorbers are likely to be extended, multi-zone regions of which only part causes directly observable absorption edges at any given time, depending on the value of the luminosity.

  9. NAPL source zone depletion model and its application to railroad-tank-car spills.

    PubMed

    Marruffo, Amanda; Yoon, Hongkyu; Schaeffer, David J; Barkan, Christopher P L; Saat, Mohd Rapik; Werth, Charles J

    2012-01-01

    We developed a new semi-analytical source zone depletion model (SZDM) for multicomponent light nonaqueous phase liquids (LNAPLs) and incorporated this into an existing screening model for estimating cleanup times for chemical spills from railroad tank cars that previously considered only single-component LNAPLs. Results from the SZDM compare favorably to those from a three-dimensional numerical model, and from another semi-analytical model that does not consider source zone depletion. The model was used to evaluate groundwater contamination and cleanup times for four complex mixtures of concern in the railroad industry. Among the petroleum hydrocarbon mixtures considered, the cleanup time of diesel fuel was much longer than E95, gasoline, and crude oil. This is mainly due to the high fraction of low solubility components in diesel fuel. The results demonstrate that the updated screening model with the newly developed SZDM is computationally efficient, and provides valuable comparisons of cleanup times that can be used in assessing the health and financial risk associated with chemical mixture spills from railroad-tank-car accidents.

  10. Announcing a Community Effort to Create an Information Model for Research Software Archives

    NASA Astrophysics Data System (ADS)

    Million, C.; Brazier, A.; King, T.; Hayes, A.

    2018-04-01

    An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.

  11. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  12. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Vho, Alice; Bistacchi, Andrea

    2015-04-01

    A quantitative analysis of fault-rock distribution is of paramount importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation along faults at depth. Here we present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM). This workflow has been developed on a real case study: the strike-slip Gole Larghe Fault Zone (GLFZ), a fault zone exhumed from ca. 10 km depth, hosted in granitoid rocks of the Adamello batholith (Italian Southern Alps). Individual seismogenic slip surfaces generally show green cataclasites (cemented by the precipitation of epidote and K-feldspar from hydrothermal fluids) and more or less well preserved pseudotachylytes (black when well preserved, greenish to white when altered). First, a digital model of the outcrop is reconstructed with photogrammetric techniques, using a large number of high-resolution digital photographs processed with the VisualSFM software. By using high-resolution photographs the DOM can achieve a much higher resolution than with LIDAR surveys, up to 0.2 mm/pixel. Then, image processing is performed to map the fault-rock distribution with the ImageJ-Fiji package. Green cataclasites and epidote/K-feldspar veins can be separated quite easily from the host rock (tonalite) using spectral analysis. In particular, band ratios and principal component analysis have been tested successfully. The mapping of black pseudotachylyte veins is trickier because the differences between the pseudotachylyte and biotite spectral signatures are not appreciable. For this reason we have tested different morphological processing tools aimed at identifying (and subtracting) the tiny biotite grains. We propose a solution based on binary images involving a combination of size and circularity thresholds. Comparing the results with manually segmented images, we noticed that major problems occur only when pseudotachylyte veins are very thin and discontinuous. After
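
    A hedged reimplementation of the size-plus-circularity filter in Python with scikit-image, rather than ImageJ/Fiji (the thresholds are illustrative):

      # Remove small, nearly circular blobs (biotite grains) from a binary
      # map so only vein-like pseudotachylyte features remain.
      import numpy as np
      from skimage import measure

      def remove_small_round_grains(binary, max_area=200, min_circularity=0.6):
          """binary: bool image; returns the filtered bool image."""
          labels = measure.label(binary)
          keep = np.zeros_like(binary, dtype=bool)
          for region in measure.regionprops(labels):
              circ = 4 * np.pi * region.area / max(region.perimeter, 1e-9) ** 2
              grain_like = region.area <= max_area and circ >= min_circularity
              if not grain_like:
                  keep[labels == region.label] = True  # keep vein-like features
          return keep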

  13. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    NASA Astrophysics Data System (ADS)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

    The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the Earth's interior. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D viscoelastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural workflow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time-consuming steps associated with authoring efficient algorithms from scratch while still keeping the flexibility that may be lost with commercial dedicated finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in uplift magnitude that are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models, and may require the inclusion of local topographic features. We use the presented algorithm to study a near-field area where field observations are abundant, namely Disko Bay in West Greenland, with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local
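    The assemble-and-solve pattern that such sparse-matrix libraries support looks, in its simplest form, like the sketch below. It uses scipy on a 1-D Poisson problem as a stand-in for Houdini's libraries and the authors' 3-D viscoelastic model, so everything here is an illustrative assumption:

    ```python
    # Minimal sparse finite element assemble-and-solve (1-D Poisson analogue).
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 100                    # interior nodes
    h = 1.0 / (n + 1)          # element size
    # Tridiagonal stiffness matrix from linear 1-D elements.
    main = 2.0 * np.ones(n)
    off = -np.ones(n - 1)
    K = sp.diags([off, main, off], [-1, 0, 1], format="csr") / h
    f = h * np.ones(n)         # uniform unit load vector
    u = spla.spsolve(K, f)     # nodal solution (displacement-like field)
    ```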

  14. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    PubMed

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to lay out and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.

  15. Software Engineering Improvement Activities/Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  16. Modeling the Migration of Fluids in Subduction Zones

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; Van Keken, P. E.; Vrijmoed, J. C.; Hacker, B. R.

    2011-12-01

    Fluids play a major role in the formation of arc volcanism and the generation of continental crust. Progressive dehydration reactions in the downgoing slab release fluids to the hot overlying mantle wedge, causing flux melting and the migration of melts to the volcanic front. While the qualitative concept is well established, the quantitative details of fluid release, and especially of fluid migration and the generation of hydrous melting in the wedge, are still poorly understood. Here we present new models of fluid migration through the mantle wedge for subduction zones. We use an existing set of high resolution metamorphic models (van Keken et al, 2010) to predict the regions of water release from the sediments, upper and lower crust, and uppermost mantle. We use this water flux as input for the fluid migration calculation, based on new finite element models built on advanced computational libraries (FEniCS/PETSc) for efficient and flexible solution of coupled multi-physics problems. The first generation of one-way coupled models solves for the evolution of porosity and fluid-pressure/flux throughout the slab and wedge, given solid flow, viscosity and thermal fields from separate solutions to the incompressible Stokes and energy equations in the mantle wedge. These solutions are verified by comparison to previous benchmark studies (van Keken et al, 2008) and global suites of thermal subduction models (Syracuse et al, 2010). Fluid flow depends on both the permeability and the rheology of the slab-wedge system, as interaction with rheological variability can induce additional pressure gradients that affect the fluid flow pathways. These non-linearities have been shown to explain laboratory observations of melt band orientation and numerical simulations of melt localization in shear bands (Katz et al 2006). Our second generation of models dispenses with the pre-calculation of incompressible mantle flow and fully couples the now compressible
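    The elementary building block of such fluid-flux calculations is Darcy's law driven by the deviation of the pore-pressure gradient from fluid hydrostatic. A minimal sketch, with all parameter values invented rather than taken from the models above:

    ```python
    # Darcy flux in a 1-D column with near-lithostatic pore pressure (assumed values).
    import numpy as np

    k = 1e-14                       # permeability, m^2 (assumed)
    mu = 1e-4                       # fluid viscosity, Pa s (assumed)
    rho_f, rho_s = 1000.0, 3300.0   # fluid / solid density, kg/m^3
    g = 9.81

    z = np.linspace(0.0, 50e3, 501)      # depth, m (positive downward)
    p = rho_s * g * z                    # assume lithostatic pore pressure
    dpdz = np.gradient(p, z)
    q = -(k / mu) * (dpdz - rho_f * g)   # Darcy flux, m/s; q < 0 means upward flow
    ```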

  17. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    DTIC Science & Technology

    2011-03-01

    capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to...

  18. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of the obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion require the use of a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
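    As a loose illustration of what one such computer experiment tracks (this is not COSY Infinity, and the spin tune is an arbitrary placeholder), a spin vector can be precessed turn by turn about the vertical axis:

    ```python
    # Toy spin-precession tracking: closed-form rotation per turn.
    import numpy as np

    nu_s = 0.16                # assumed spin tune (precessions per turn)
    turns = 1_000_000
    phase = 2.0 * np.pi * nu_s * np.arange(turns)
    sx, sy = np.cos(phase), np.sin(phase)   # in-plane spin components per turn
    # A real lattice replaces this with element-by-element integration,
    # hence the 10^12-10^15 steps quoted above.
    ```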

  19. Modeling water infiltration and pesticides transport in unsaturated zone of a sedimentary aquifer

    NASA Astrophysics Data System (ADS)

    Sidoli, Pauline; Angulo-Jaramillo, Rafael; Baran, Nicole; Lassabatère, Laurent

    2015-04-01

    Groundwater quality monitoring has become an important environmental, economic and community issue, as needs for drinking water increase alongside high anthropogenic pressure on aquifers. Leaching of contaminants such as pesticides into groundwater is closely tied to water infiltration through the unsaturated zone, during which solute transport can occur. Knowledge of the mechanisms involved in the transfer of pesticides in the deep unsaturated zone is lacking today. This study aims to evaluate and model the leaching of pesticides and their metabolites in the highly heterogeneous unsaturated zone of a fluvio-glacial aquifer in the South-East of France, where contamination of groundwater resources by pesticides is frequently observed as a consequence of intensive agricultural activities. Water flow and pesticide transport were evaluated from column tests under unsaturated conditions and from adsorption batch experiments on the predominant lithofacies collected, composed of a mixture of sand and gravel. A maize herbicide, S-metolachlor, applied on the study site and worldwide, and its two major degradation products (metolachlor ethanesulfonic acid and metolachlor oxanilic acid) were studied here. A conservative tracer, bromide ion, was used to determine the water dispersive parameters of the porous media. Elution curves were obtained from pesticide concentrations analyzed by an ultra-performance liquid chromatography system interfaced to a triple quadrupole mass spectrometer, and from bromide concentrations measured by an ion chromatography system. Experimental data were implemented in Hydrus to model flow and solute transfer through a 1D profile in the vadose zone. A nonequilibrium solute transport model based on a dual-porosity formulation with mobile and immobile water fits the elution curves correctly. The water dispersive parameters show that flow takes place in the mobile phase, and exchanges between mobile and immobile water are very limited. Because of low adsorptions onto
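    A minimal numerical sketch of the mobile-immobile (dual-porosity) concept fitted above, with all parameters invented rather than taken from the column experiments:

    ```python
    # 1-D advection-dispersion in the mobile phase with first-order exchange
    # to an immobile phase (explicit finite differences; illustrative only).
    import numpy as np

    nx, nt = 200, 5000
    dx, dt = 0.005, 1.0              # m, s
    v, D = 1e-5, 1e-8                # pore-water velocity (m/s), dispersion (m^2/s)
    theta_m, theta_im = 0.25, 0.10   # mobile / immobile water contents
    alpha = 1e-6                     # first-order exchange coefficient (1/s)

    cm = np.zeros(nx)                # mobile-phase concentration
    cim = np.zeros(nx)               # immobile-phase concentration
    for _ in range(nt):
        cm[0] = 1.0                  # constant-concentration inlet
        adv = -v * np.gradient(cm, dx)
        disp = D * np.gradient(np.gradient(cm, dx), dx)
        exch = alpha * (cm - cim)
        cm += dt * (adv + disp - exch / theta_m)
        cim += dt * exch / theta_im
    ```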

  20. Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Korkala, Mikko

    Working in the globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development; time-zone-independent "follow the sun" development, access to well-educated labour, maturation of the technical infrastructure and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of companies' internationalization purposes or good market opportunities.

  1. Modeling some two-dimensional relativistic phenomena using an educational interactive graphics software

    NASA Astrophysics Data System (ADS)

    Sastry, G. P.; Ravuri, Tushar R.

    1990-11-01

    This paper describes several relativistic phenomena in two spatial dimensions that can be modeled using the collision program of Spacetime Software. These include the familiar aberration, the Doppler effect, the headlight effect, and the invariance of the speed of light in vacuum, in addition to rather unfamiliar effects such as the dragging of light in a moving medium, reflection at moving mirrors, Wigner rotation of noncommuting boosts, and relativistic rotation of shrinking and expanding rods. All these phenomena are exhibited by tracings of composite computer printouts of the collision movie. It is concluded that interactive educational graphics software with pleasing visuals can have considerable investigative power packed within it.
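    Two of the effects listed above have compact closed forms that are easy to reproduce (these are the standard special-relativity formulas, not the Spacetime Software code, and sign/angle conventions vary):

    ```python
    # Relativistic aberration and Doppler shift for a ray at angle theta.
    import numpy as np

    def aberrate(theta, beta):
        """Ray angle seen from a frame moving at speed beta*c."""
        return np.arccos((np.cos(theta) + beta) / (1.0 + beta * np.cos(theta)))

    def doppler(freq, theta, beta):
        """Doppler-shifted frequency (one common convention)."""
        gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
        return freq * gamma * (1.0 + beta * np.cos(theta))

    # Headlight effect: a ray emitted at 90 degrees bunches forward at beta = 0.9.
    print(np.degrees(aberrate(np.pi / 2, 0.9)))   # ~25.8 degrees
    ```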

  2. Comparison of modeled traffic exposure zones using on-road air pollution measurements

    EPA Science Inventory

    Modeled traffic data were used to develop traffic exposure zones (TEZs) such as traffic delay, high volume, and transit routes in the Research Triangle area of North Carolina (USA). On-road air pollution measurements of nitrogen dioxide (NO2), carbon monoxide (CO), carbon dioxid...

  3. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    PubMed

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
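    The flavor of such an identifiability test can be conveyed with a toy model (sympy standing in for DAISY's differential-algebra machinery; the model below is a generic one-compartment example, not one of the paper's benchmarks):

    ```python
    # For x' = -k*x, y = c*x: which parameters are recoverable from y(t)?
    import sympy as sp

    t = sp.symbols('t')
    k, c, x0 = sp.symbols('k c x0', positive=True)
    x = x0 * sp.exp(-k * t)          # solution of x' = -k*x with x(0) = x0
    y = c * x
    coeffs = [sp.simplify(sp.diff(y, t, n).subs(t, 0)) for n in range(3)]
    print(coeffs)                    # [c*x0, -c*k*x0, c*k**2*x0]
    # k is globally identifiable (ratio of successive coefficients), while
    # c and x0 enter only as the product c*x0 and cannot be separated.
    ```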

  4. Engine structures analysis software: Component Specific Modeling (COSMO)

    NASA Astrophysics Data System (ADS)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  5. Engine Structures Analysis Software: Component Specific Modeling (COSMO)

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-01-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  6. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    USGS Publications Warehouse

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.
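    The core Bayesian update behind such a network can be illustrated with a one-variable discrete example (the bins, forward model and noise level are all invented; the actual study used a full Bayesian network, not this scalar version):

    ```python
    # Discrete Bayes update: infer offshore wave height from one noisy
    # nearshore observation through a crude decay "forward model".
    import numpy as np

    H = np.linspace(0.5, 3.0, 26)        # candidate offshore heights, m
    prior = np.ones_like(H) / H.size     # uninformative prior
    model = 0.6 * H                      # assumed surf-zone decay to nearshore
    obs, sigma = 1.1, 0.2                # observed nearshore height and error, m
    like = np.exp(-0.5 * ((obs - model) / sigma) ** 2)
    post = prior * like
    post /= post.sum()                   # normalized posterior over H
    print(H[np.argmax(post)])            # MAP offshore height, ~1.8 m
    ```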

  7. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.

  8. Validating the Performance of the FHWA Work Zone Model Version 1.0: A Case Study Along I-91 in Springfield, Massachusetts

    DOT National Transportation Integrated Search

    2017-08-01

    Central to the effective design of work zones is being able to understand how drivers behave as they approach and enter a work zone area. States use simulation tools in modeling freeway work zones to predict work zone impacts and to select optimal de...

  9. Modeling of tropospheric NO2 column over different climatic zones and land use/land cover types in South Asia

    NASA Astrophysics Data System (ADS)

    ul-Haq, Zia; Rana, Asim Daud; Tariq, Salman; Mahmood, Khalid; Ali, Muhammad; Bashir, Iqra

    2018-03-01

    We have applied regression analyses to model the tropospheric NO2 (tropo-NO2) column as a function of anthropogenic nitrogen oxides (NOx) emissions, aerosol optical depth (AOD), and some important meteorological parameters such as temperature (Temp), precipitation (Preci), relative humidity (RH), wind speed (WS), cloud fraction (CLF) and outgoing long-wave radiation (OLR) over different climatic zones and land use/land cover types in South Asia during October 2004-December 2015. Simple linear regression shows that, over South Asia, tropo-NO2 variability is significantly linked to AOD, WS, NOx, Preci and CLF. Zone-5, consisting of the tropical monsoon areas of eastern India and Myanmar, is the only study zone over which all the selected parameters influence tropo-NO2 at statistically significant levels. In stepwise multiple linear modeling, the tropo-NO2 column over the landmass of South Asia is significantly predicted by the combination of RH (standardized regression coefficient β = -0.49), AOD (β = 0.42) and NOx (β = 0.25). The leading predictors of tropo-NO2 columns over zones 1-5 are OLR, AOD, Temp, OLR, and RH, respectively. Overall, as revealed by the higher correlation coefficients (r), the multiple regressions provide reasonable models for tropo-NO2 over South Asia (r = 0.82), zone-4 (r = 0.90) and zone-5 (r = 0.93). The lowest r (0.66) is found for the hot semi-arid region in the northwestern Indus-Ganges Basin (zone-2). The highest urban-area AOD β (0.42) is observed for megacity Lahore, located in warm semi-arid zone-2 with large-scale crop-residue burning, indicating a strong influence of aerosols on the modeled tropo-NO2 column. A statistically significant correlation (r = 0.22 at the 0.05 level) is found between tropo-NO2 and AOD over Lahore. NOx emissions also appear as the highest contributor (β = 0.59) to the modeled tropo-NO2 column over megacity Dhaka.
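    The standardized coefficients (β) quoted above come from regressing z-scored variables on each other; a generic sketch with random placeholder data (not the study's dataset) is:

    ```python
    # Standardized multiple linear regression via least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(132, 3))    # stand-ins for RH, AOD, NOx (monthly values)
    y = -0.5 * X[:, 0] + 0.4 * X[:, 1] + 0.25 * X[:, 2] + rng.normal(0, 0.3, 132)

    def zscore(a):
        return (a - a.mean(axis=0)) / a.std(axis=0)

    beta, *_ = np.linalg.lstsq(zscore(X), zscore(y), rcond=None)
    print(beta)    # standardized betas, directly comparable across predictors
    ```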

  10. Reanalysis of water and carbon cycle models at a critical zone observatory

    USDA-ARS?s Scientific Manuscript database

    The Susquehanna Shale Hills Critical Zone Observatory (SSHCZO) is a forested, hill-slope catchment located in the temperate-climate of central Pennsylvania with an extensive network of ground-based instrumentation for model testing and development. In this paper we discuss the use of multi-state fi...

  11. Quasi 3D modeling of water flow in vadose zone and groundwater

    USDA-ARS?s Scientific Manuscript database

    The complexity of subsurface flow systems calls for a variety of concepts leading to the multiplicity of simplified flow models. One habitual simplification is based on the assumption that lateral flow and transport in unsaturated zone are not significant unless the capillary fringe is involved. In ...

  12. Modeling the influence of coupled mass transfer processes on mass flux downgradient of heterogeneous DNAPL source zones

    NASA Astrophysics Data System (ADS)

    Yang, Lurong; Wang, Xinyu; Mendoza-Sanchez, Itza; Abriola, Linda M.

    2018-04-01

    Sequestered mass in low permeability zones has been increasingly recognized as an important source of organic chemical contamination that acts to sustain downgradient plume concentrations above regulated levels. However, few modeling studies have investigated the influence of this sequestered mass and associated (coupled) mass transfer processes on plume persistence in complex dense nonaqueous phase liquid (DNAPL) source zones. This paper employs a multiphase flow and transport simulator (a modified version of the modular transport simulator MT3DMS) to explore the two- and three-dimensional evolution of source zone mass distribution and near-source plume persistence for two ensembles of highly heterogeneous DNAPL source zone realizations. Simulations reveal the strong influence of subsurface heterogeneity on the complexity of DNAPL and sequestered (immobile/sorbed) mass distribution. Small zones of entrapped DNAPL are shown to serve as a persistent source of low concentration plumes, difficult to distinguish from other (sorbed and immobile dissolved) sequestered mass sources. Results suggest that the presence of DNAPL tends to control plume longevity in the near-source area; for the examined scenarios, a substantial fraction (43.3-99.2%) of plume life was sustained by DNAPL dissolution processes. The presence of sorptive media and the extent of sorption non-ideality are shown to greatly affect predictions of near-source plume persistence following DNAPL depletion, with plume persistence varying one to two orders of magnitude with the selected sorption model. Results demonstrate the importance of sorption-controlled back diffusion from low permeability zones and reveal the importance of selecting the appropriate sorption model for accurate prediction of plume longevity. Large discrepancies for both DNAPL depletion time and plume longevity were observed between 2-D and 3-D model simulations. Differences between 2- and 3-D predictions increased in the presence of
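    The sorption-model sensitivity noted above shows up already in the retardation factor that scales solute travel time. A minimal sketch with illustrative (not the paper's) parameter values:

    ```python
    # Retardation factor for linear vs. Freundlich sorption.
    rho_b, theta = 1.65, 0.30     # bulk density (g/cm^3), porosity

    def R_linear(Kd):
        return 1.0 + rho_b * Kd / theta

    def R_freundlich(Kf, n, c):
        # Concentration-dependent retardation from the isotherm q = Kf*c**n.
        return 1.0 + rho_b * Kf * n * c ** (n - 1.0) / theta

    print(R_linear(0.5))                   # 3.75
    print(R_freundlich(0.5, 0.8, 0.01))    # grows as c -> 0 when n < 1
    ```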

  13. Three-dimensional models of elastostatic deformation in heterogeneous media, with applications to the Eastern California Shear Zone

    NASA Astrophysics Data System (ADS)

    Barbot, Sylvain; Fialko, Yuri; Sandwell, David

    2009-10-01

    We present a semi-analytic iterative procedure for evaluating the 3-D deformation due to faults in an arbitrarily heterogeneous elastic half-space. Spatially variable elastic properties are modelled with equivalent body forces and equivalent surface traction in a 'homogenized' elastic medium. The displacement field is obtained in the Fourier domain using a semi-analytic Green function. We apply this model to investigate the response of 3-D compliant zones (CZ) around major crustal faults to coseismic stressing by nearby earthquakes. We constrain the two elastic moduli, as well as the geometry of the fault zones, by comparing the model predictions to interferometric Synthetic Aperture Radar (InSAR) data. Our results confirm that the CZ models for the Rodman, Calico and Pinto Mountain faults in the Eastern California Shear Zone (ECSZ) can explain the coseismic InSAR data from both the Landers and the Hector Mine earthquakes. For the Pinto Mountain fault zone, InSAR data suggest a 50 per cent reduction in effective shear modulus and no significant change in Poisson's ratio compared to the ambient crust. The large wavelength of coseismic line-of-sight displacements around the Pinto Mountain fault requires a fairly wide (~1.9 km) CZ extending to a depth of at least 9 km. The best fit for the Calico CZ, north of Galway Dry Lake, is obtained for a 4 km deep structure with a 60 per cent reduction in shear modulus and no change in Poisson's ratio. We find that the required effective rigidity of the Calico fault zone south of Galway Dry Lake is not as low as that of the northern segment, suggesting along-strike variations of effective elastic moduli within the same fault zone. The ECSZ InSAR data are best explained by CZ models with reductions in both shear and bulk moduli. These observations suggest pervasive and widespread damage around active crustal faults.

  14. A data model for clinical legal medicine practice and the development of a dedicated software for both practitioners and researchers.

    PubMed

    Dang, Catherine; Phuong, Thomas; Beddag, Mahmoud; Vega, Anabel; Denis, Céline

    2018-07-01

    To present a data model for clinical legal medicine and the software based on that data model, for both practitioners and researchers. The main functionalities of the presented software are computer-assisted production of medical certificates and data capture, storage and retrieval. The data model and the software were jointly developed by the department of forensic medicine of the Jean Verdier Hospital (Bondy, France) and a bioinformatics laboratory (LIMICS, Paris universities 6-13) between November 2015 and May 2016. The data model was built from four sources: i) a template used in our department for producing standardised medical certificates; ii) a random sample of medical certificates produced by the forensic department; iii) prior consensus between four healthcare professionals (two forensic practitioners, a psychologist and a forensic psychiatrist); and iv) anatomical dictionaries. The trial version of the open source software was first designed for the examination of physical assault survivors. A UML-like data model dedicated to clinical legal practice was built. The data model describes the terminology for examinations of sexual assault survivors, physical assault survivors, individuals kept in police custody, and undocumented migrants for age estimation. A trial version of software relying on the data model was developed and tested by three physicians. The software allows file archiving, standardised data collection and extraction, and assistance with certificate generation. It can be used for research purposes through data exchange and analysis. Despite some current limitations of use, it is a tool which can be shared and used by other departments of forensic medicine and other specialties, improving data management and exploitation. Full integration with external sources, analytics software and a semantic interoperability framework is planned for the coming months. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights

  15. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.

  16. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of
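    For contrast with the network-free approach, Gillespie's direct method on a fixed (pre-generated) reaction network fits in a few lines; the two-reaction system and rate constants below are invented for illustration:

    ```python
    # Gillespie direct method for A -> B -> 0.
    import numpy as np

    rng = np.random.default_rng(1)
    a_count, b_count = 100, 0
    k1, k2 = 0.1, 0.05                 # rate constants (1/s), assumed
    t, t_end = 0.0, 100.0
    while t < t_end:
        rates = np.array([k1 * a_count, k2 * b_count])
        total = rates.sum()
        if total == 0:
            break                      # no reaction can fire
        t += rng.exponential(1.0 / total)          # waiting time to next event
        if rng.random() < rates[0] / total:        # choose which reaction fires
            a_count -= 1; b_count += 1
        else:
            b_count -= 1
    print(t, a_count, b_count)
    ```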

  17. Stability and instability of thermocapillary convection in models of the float-zone crystal-growth process

    NASA Technical Reports Server (NTRS)

    Neitzel, G. P.

    1993-01-01

    This project was concerned with the determination of conditions of guaranteed stability and instability for thermocapillary convection in a model of the float-zone crystal-growth process. This model, referred to as the half-zone, has been studied extensively, both experimentally and theoretically. Our own earlier research determined, using energy-stability theory, sufficient conditions for stability to axisymmetric disturbances. Nearly all results were computed for the case of a liquid with Prandtl number Pr = 1. Attempts to compute cases for higher Prandtl numbers, to allow comparison with the experimental results of other researchers, were unsuccessful, but indicated that the condition guaranteeing stability against axisymmetric disturbances would be a value of the Marangoni number (Ma) significantly higher than that at which oscillatory convection was observed experimentally. Thus, additional results were needed to round out the stability picture for this model problem. The research performed under this grant consisted of the following: (1) computation of energy-stability limits for non-axisymmetric disturbances; (2) computation of linear-stability limits for axisymmetric and non-axisymmetric disturbances; (3) numerical simulation of the basic state for half- and full-zones with a deformable free surface; and (4) incorporation of radiation heat transfer into a model energy-stability problem. Each of these is summarized briefly below.

  18. Geodynamic Modeling of the Subduction Zone around the Japanese Islands

    NASA Astrophysics Data System (ADS)

    Honda, S.

    2017-06-01

    In this review, which focuses on our research, we describe the development of thermomechanical modeling of subduction zones, paying special attention to those around the Japanese Islands. Without a sufficient amount of data and observations, models tended to be conceptual and general. However, the increasing power of computational tools has allowed simple analytical and numerical models to become more realistic by incorporating the mantle flow around the subducting slab, and the accumulation of observations and data has made it possible to construct regional models to understand the details of the subduction processes. Recent advancements in the study of seismic tomography and geology around the Japanese Islands have enabled new aspects of modeling the mantle processes. A good correlation between the seismic velocity anomalies and the finger-like distribution of volcanoes in northeast Japan has been recognized, and small-scale convection (SSC) in the mantle wedge has been proposed to explain this feature. The spatial and temporal evolution of the distribution of past volcanoes may reflect the characteristics of the flow in the mantle wedge, and points to the possibility of flip-flopping of the finger-like pattern of the volcano distribution and the migration of volcanic activity from the back-arc side to the trench side. These observations are found to be qualitatively consistent with the results of the SSC model. We have also investigated the expected seismic anisotropy in the presence of SSC. The fast direction of the P-wave anisotropy generally shows the trench-normal direction, with a reduced magnitude compared to the case without SSC. An analysis of full 3D seismic anisotropy is necessary to confirm the existence and nature of SSC. The 3D mantle flow around the subduction zone at plate-size scale has been modeled. It was found that the trench-parallel flow in the sub-slab mantle around the northern edge of the Pacific plate at the junction between

  19. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) Configuration Items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  20. Finite element models of earthquake cycles in mature strike-slip fault zones

    NASA Astrophysics Data System (ADS)

    Lynch, John Charles

    The research presented in this dissertation is on the subject of strike-slip earthquakes and the stresses that build and release in the Earth's crust during earthquake cycles. Numerical models of these cycles in a layered elastic/viscoelastic crust are produced using the finite element method. A fault that alternately sticks and slips poses a particularly challenging problem for numerical implementation, and a new contact element dubbed the "Velcro" element was developed to address this problem (Appendix A). Additionally, the finite element code used in this study was benchmarked against analytical solutions for some simplified problems (Chapter 2), and the resolving power was tested for the fault region of the models (Appendix B). With the modeling method thus developed, two main questions are posed. First, in Chapter 3, the effect of a finite-width shear zone is considered. By defining a viscoelastic shear zone beneath a periodically slipping fault, it is found that shear stress concentrates at the edges of the shear zone and thus causes the stress tensor to rotate into non-Andersonian orientations. Several methods are used to examine the stress patterns, including the plunge angles of the principal stresses and a new method that plots the stress tensor in a manner analogous to seismic focal-mechanism diagrams. In Chapter 4, a simple San Andreas-like model is constructed, consisting of two great-earthquake-producing faults separated by a freely slipping shorter fault. The model inputs of lower crustal viscosity, fault separation distance, and relative breaking strengths are examined for their effect on fault communication. It is found that with a lower crustal viscosity of 10^18 Pa s (in the lower range of estimates for California), the two faults tend to synchronize their earthquake cycles, even in cases where the faults have asymmetric breaking strengths. These models imply that postseismic stress transfer over hundreds of kilometers may play a
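    The viscosity quoted above sets the Maxwell relaxation time that controls how quickly the lower crust transfers stress between faults; with an assumed (not stated in the abstract) crustal shear modulus:

    ```python
    # Maxwell relaxation time of the lower crust.
    eta = 1e18             # viscosity from the abstract, Pa s
    G = 30e9               # shear modulus, Pa (assumed typical value)
    tau = eta / G          # relaxation time, s
    print(tau / 3.15e7)    # ~1 year, i.e. fast relative to earthquake cycles
    ```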