20 similar documents found (search time: 250 ms)
2.
Jarosław Jasiewicz 《Computers & Geosciences》2011,37(9):1525-1531
GIS systems are frequently coupled with fuzzy logic systems implemented in statistical packages. For large GIS data sets comprising millions or tens of millions of cells, such an approach is relatively time-consuming. For very large data sets there is also an input/output bottleneck between the GIS and the external software. The aim of this paper is to present a low-level implementation of Mamdani's fuzzy inference system designed to work with massive GIS data sets, using the GRASS GIS raster data processing engine.
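As an illustration of the per-cell computation involved, here is a minimal Mamdani-style inference sketch in Python/NumPy, vectorised over whole raster arrays. The membership functions, rules, and variable names are invented for the example; the paper's implementation is written against the GRASS C engine and differs.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function, evaluated element-wise."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_raster(slope, wetness):
    """Toy two-rule Mamdani system applied to whole raster arrays at once.
    Rule 1: IF slope is steep AND wetness is high THEN hazard is high.
    Rule 2: IF slope is gentle THEN hazard is low."""
    steep = tri(slope, 10.0, 30.0, 50.0)
    gentle = tri(slope, -10.0, 0.0, 20.0)
    wet = tri(wetness, 0.4, 0.8, 1.2)

    w1 = np.minimum(steep, wet)   # AND = min (rule 1 firing strength)
    w2 = gentle                   # rule 2 firing strength

    z = np.linspace(0.0, 1.0, 101)   # discretised output domain
    high = tri(z, 0.5, 1.0, 1.5)
    low = tri(z, -0.5, 0.0, 0.5)

    # Clip each consequent at its rule strength, aggregate with max,
    # then defuzzify by centroid (cells where no rule fires yield 0).
    agg = np.maximum(np.minimum(w1[..., None], high),
                     np.minimum(w2[..., None], low))
    total = agg.sum(axis=-1)
    return (agg * z).sum(axis=-1) / np.where(total > 0, total, 1.0)
```

Because every step is a vectorised array operation, the whole raster is processed without a per-cell Python loop, which is the point of a low-level raster-engine implementation.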
3.
The development and interpretation of morphometric maps are important tools in studies related to neotectonics and geomorphology; Geographic Information Systems (GIS) add speed and precision to this process, but the methodology applied will vary according to the available tools and each researcher's familiarity with the software involved. A methodology to integrate GIS and statistics in morphometric analysis is presented for the most common morphometric parameters: hypsometry, slope, aspect, swath profiles, lineaments and drainage density, surface roughness, isobase and hydraulic gradient. The GIS used was the Geographic Resources Analysis Support System (GRASS GIS), an open-source project that offers an integrated environment for raster and vector analysis, image processing and map/graphics creation. Statistical analysis of the parameters can be carried out in R, a system for statistical computation and graphics, through an interface with GRASS that allows raster maps and point files to be treated as variables for analysis. The basic element for deriving morphometric maps is the digital elevation model (DEM). It can be interpolated from scattered points or contours, in either raster or vector format; it is also possible to use DEMs from NASA's Shuttle Radar Topography Mission, with 30 m ground resolution for the USA and 90 m elsewhere. The proposed methodology can be adapted according to needs and available tools. The use of free and open-source tools guarantees access for everyone, and their increasing popularity opens new development perspectives in this research field.
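As a small example of deriving one of these parameters, a slope map can be computed from a DEM with central differences. This sketch mirrors, in simplified form with NumPy rather than GRASS's r.slope.aspect, the kind of derivation the methodology relies on (no edge handling beyond numpy's own, no latitude/longitude correction).

```python
import numpy as np

def slope_deg(dem, cellsize):
    """Slope in degrees from a DEM grid via central differences:
    slope = arctan(sqrt((dz/dx)^2 + (dz/dy)^2))."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
```

For a plane rising 10 m per 10 m cell, the function returns 45 degrees everywhere in the interior, as expected.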
5.
Data I/O has become a major bottleneck in the computational performance of geospatial analysis and modeling. In this study, a parallel GeoTIFF I/O library (pGTIOL) was developed. Through storage-mapping and data-arrangement techniques, pGTIOL can operate on files in either strip or tile storage mode and read/write any sub-domain of data within the raster dataset. pGTIOL enables asynchronous I/O, meaning a process can read/write its own sub-domains of data when necessary without synchronizing with other processes. pGTIOL was integrated into the parallel raster processing library (pRPL). Several pGTIOL-based data I/O functions and options were added to pRPL, while the existing functions of pRPL remain intact. Experiments showed that the integration of pRPL and pGTIOL achieved higher performance than the original pRPL, which uses GDAL as its I/O interface. pRPL + pGTIOL therefore enables transparent parallelism for high-performance raster processing with the capability of true parallel I/O of massive raster datasets.
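The storage-mapping step can be illustrated with a small sketch: given the sub-domain window requested by one process, compute which tiles of a tiled raster it touches, so each process reads only the tiles it needs. The function below is an invented illustration of the idea, not pGTIOL's actual API.

```python
def tiles_for_window(row0, col0, nrows, ncols, tile_h, tile_w):
    """Return the (tile_row, tile_col) indices of every tile that a
    sub-domain window intersects, for a raster stored in tile mode.
    The window starts at (row0, col0) and spans nrows x ncols cells."""
    t0, t1 = row0 // tile_h, (row0 + nrows - 1) // tile_h
    s0, s1 = col0 // tile_w, (col0 + ncols - 1) // tile_w
    return [(tr, tc) for tr in range(t0, t1 + 1)
                     for tc in range(s0, s1 + 1)]
```

Each process can derive its own tile list independently, which is what makes asynchronous per-process reads possible without coordination.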
6.
Jeffrey S. Rosenthal 《Computational statistics & data analysis》2007,51(12):5467-5470
We describe AMCMC, a software package for running adaptive MCMC algorithms on user-supplied density functions. AMCMC provides the user with an R interface, which in turn calls C programs for faster computation. The user can supply the density and functionals either as R objects or as auxiliary C files. We describe experiments illustrating that, for fast performance in high dimensions, the latter option is preferable.
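A toy version of the kind of adaptive algorithm AMCMC automates: a 1-D random-walk Metropolis sampler whose proposal scale is tuned on-line toward a target acceptance rate. This is a pure-Python sketch for illustration; AMCMC itself is an R/C package and its interface differs.

```python
import math
import random

def adaptive_metropolis(logpdf, x0, n_iter=4000, target=0.44, seed=1):
    """1-D adaptive random-walk Metropolis: the log proposal scale is
    nudged toward a target acceptance rate with diminishing step sizes
    (Robbins-Monro style), so adaptation fades out over time."""
    rng = random.Random(seed)
    x, log_scale, samples = x0, 0.0, []
    for i in range(1, n_iter + 1):
        y = x + math.exp(log_scale) * rng.gauss(0.0, 1.0)
        accepted = math.log(rng.random()) < logpdf(y) - logpdf(x)
        if accepted:
            x = y
        # Raise the scale after acceptances, lower it after rejections.
        log_scale += ((1.0 if accepted else 0.0) - target) / math.sqrt(i)
        samples.append(x)
    return samples
```

Run against a standard normal log-density, the chain's long-run mean and variance should be close to 0 and 1.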
7.
S. Martelli, N. Lopomo, S. Greggio, E. Ferretti, A. Visani 《Computer methods and programs in biomedicine》2006,83(1):50-56
This paper describes a new software environment for advanced analysis of diarthrodial joints. The new tool provides a number of elaboration functions to investigate joint kinematics, bone anatomy, and ligament and tendon properties. In particular, the shapes and contact points of the articulating surfaces can be displayed and analysed through 2D user-defined sections and fittings (lines or conics). Ligament behaviour can be evaluated during joint movement through the computation of elongations, orientations, and fiber strain. Motion trajectories can also be analysed through the calculation of helical axes, instantaneous rotations, and displacements in user-chosen coordinate reference frames. The software has a user-friendly graphical interface to display four-dimensional (time-space) data obtained from medical images, navigation systems, spatial linkages or digitizers, and can also generate printable reports and multiple graphs, as well as ASCII files that can be imported into spreadsheet programs such as Microsoft Excel.
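One of the simplest quantities such a tool reports, fiber strain, can be sketched directly from the tracked 3-D attachment points. This is the standard engineering-strain formula applied to a ligament fiber, an illustration rather than the software's actual code.

```python
import math

def fiber_strain(p_origin, p_insertion, rest_length):
    """Ligament fiber strain during joint motion:
    (current length - rest length) / rest length,
    where the current length is the distance between the tracked
    origin and insertion points of the fiber."""
    length = math.dist(p_origin, p_insertion)
    return (length - rest_length) / rest_length
```

Evaluated at each captured time frame, this yields the strain-versus-motion curves the environment plots for each fiber.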
8.
The deductive database management system DEDBMS is a subsystem of the engineering database management system EDBMS/2. This paper focuses on its production-rule language user interface, its workflow, and its functional characteristics. With the development of DEDBMS, EDBMS/2 gains not only engineering data management but also data deduction and knowledge-processing capabilities, making it a powerful supporting tool for the integration and intelligence of CAD/CAM systems.
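The deduction step can be illustrated with a minimal forward-chaining evaluator over production rules. The rule representation below (antecedent tuple, consequent) is invented for the example and is unrelated to DEDBMS's actual production-rule language.

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly fire any rule whose
    antecedents are all known facts, adding its consequent,
    until no rule produces anything new."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in facts and all(a in facts for a in antecedents):
                facts.add(consequent)
                changed = True
    return facts
```

Chaining derived facts through further rules is what lets a deductive layer answer queries the stored engineering data alone cannot.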
9.
The key to storing aircraft geometric data in a database is to develop, on the basis of an analysis of graphics file structures, interface programs between the CAD software's graphics data and the database, so that graphics data and database data can be exchanged freely. This paper studies the detailed structure of the IGES and DXF file formats and develops an aircraft geometric database management system capable of importing IGES and DXF files into, and exporting them from, the database.
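DXF's structure, alternating group-code lines and value lines, makes the import side easy to sketch. The toy parser below collects entity type names from the ENTITIES section, the first step before mapping geometry into database records; it is illustrative only, not the system described in the paper.

```python
def dxf_pairs(lines):
    """Iterate (group_code, value) pairs from ASCII DXF text, which
    stores each datum as a group-code line followed by a value line."""
    it = iter(lines)
    for code in it:
        yield int(code), next(it).strip()

def entity_types(dxf_text):
    """Collect entity type names (LINE, ARC, ...) appearing between
    the ENTITIES section header and its ENDSEC marker."""
    types, in_entities = [], False
    for code, value in dxf_pairs(dxf_text.splitlines()):
        if code == 2 and value == "ENTITIES":
            in_entities = True
        elif code == 0 and value == "ENDSEC":
            in_entities = False
        elif in_entities and code == 0:   # group code 0 starts an entity
            types.append(value)
    return types
```

A real importer would go on to read each entity's coordinate group codes (10, 20, 30, ...) and write the geometry into database tables.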
10.
Research and implementation of an R-tree index model on a GIS platform
This paper describes the basic structure and characteristics of a GIS platform, GRASS, analyses the file structure of GRASS vector maps and several basic spatial indexing algorithms, explains the basic ideas and methods for building an R-tree index mechanism, and proposes an implementation strategy for an R-tree index model on the GRASS platform.
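The core of an R-tree window query is recursive descent guided by minimum-bounding-rectangle (MBR) overlap tests, which can be sketched as follows. This is a toy in-memory structure, not GRASS's on-disk index format.

```python
def intersects(a, b):
    """Axis-aligned MBR overlap test; boxes are (xmin, ymin, xmax, ymax)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def rtree_query(node, window, hits):
    """Recursive window query over a minimal R-tree. Each node is a
    (mbr, payload) pair: payload is a list of child nodes for internal
    nodes, or a feature id for leaves."""
    mbr, payload = node
    if not intersects(mbr, window):
        return                      # prune the whole subtree
    if isinstance(payload, list):   # internal node: recurse into children
        for child in payload:
            rtree_query(child, window, hits)
    else:                           # leaf: report the feature
        hits.append(payload)
```

Pruning whole subtrees whose MBRs miss the query window is what makes the index pay off over a linear scan of all features.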
11.
Taking the raster GIS ERDAS and the two-dimensional debris-flow model system DEBRIS as its subject, this paper discusses their integration from several aspects: automatic acquisition of model parameters, data processing, the integration strategy and operating mechanism between the GIS and the model system, GIS management and visualisation of the multi-temporal debris-flow data produced by the model, and in-depth analysis of the debris-flow simulation data. On this basis, an effective connection between the two systems is designed and implemented.
12.
Design of a CORBA-based integration system for heterogeneous data sources
This paper proposes a design for a plug-and-play integration system for heterogeneous, multiple data sources based on CORBA (Common Object Request Broker Architecture). Because the expressive OIM (object model for integration) is adopted as the common data model of the integration system, the system can integrate not only various heterogeneous data sources, including database systems, file systems, and data in HTML files on the WWW, but also data from new sources plugged in at any time. The paper focuses on the overall architecture, the OIM object model, query processing, and interface design.
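The wrapper/mediator pattern underlying such a system can be sketched without CORBA: each source hides behind a wrapper with a common query interface, and new wrappers can be registered at any time, giving the plug-and-play behaviour. Class and method names here are invented for illustration.

```python
class Wrapper:
    """Common interface every data-source wrapper must implement."""
    def query(self, predicate):
        raise NotImplementedError

class DictSource(Wrapper):
    """Wrapper around an in-memory list of row dicts, standing in for
    a database, file system, or web source."""
    def __init__(self, rows):
        self.rows = rows
    def query(self, predicate):
        return [r for r in self.rows if predicate(r)]

class Mediator:
    """Fans a query out to every wrapper currently plugged in, so new
    sources can be integrated at any time without changing clients."""
    def __init__(self):
        self.sources = []
    def register(self, wrapper):
        self.sources.append(wrapper)
    def query(self, predicate):
        results = []
        for source in self.sources:
            results.extend(source.query(predicate))
        return results
```

In the paper's design the wrapper boundary is a CORBA object interface, so sources can live in different processes and languages; the structure is the same.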
13.
14.
To meet the need for effective management of massive remote-sensing image data, a spatial raster data extension mechanism is studied and implemented in the kernel of the Chinese-developed database management system X-Base. A spatial raster extension architecture based on the X-Base relational database is designed, an architectural model for the raster data extension is established, and an XML-based method for managing spatial raster data is proposed. The extended X-Base database can thus effectively store and manage spatial raster data, further extending X-Base's functionality.
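The XML-based management idea can be sketched as serialising per-raster metadata into an XML fragment that a relational kernel can store and query alongside the tile data. The element and attribute names below are invented for illustration, not X-Base's actual schema.

```python
import xml.etree.ElementTree as ET

def raster_metadata_xml(name, rows, cols, bands, srid):
    """Serialise raster metadata (dimensions, band count, spatial
    reference) as a small XML document."""
    root = ET.Element("raster", name=name)
    ET.SubElement(root, "size",
                  rows=str(rows), cols=str(cols), bands=str(bands))
    ET.SubElement(root, "srs", srid=str(srid))
    return ET.tostring(root, encoding="unicode")
```

Storing the metadata as XML lets the extension describe rasters of varying structure without altering the relational schema for each new sensor or product type.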
15.
Future factories will feature strong integration of physical machines and cyber-enabled software, working seamlessly to improve manufacturing production efficiency. In these digitally enabled, network-connected factories, each physical machine on the shop floor can have a 'virtual twin' available in cyberspace. This 'virtual twin' is populated with data streaming in from the physical machine to represent a near real-time as-is state of the machine in cyberspace, virtualizing the machine resource for external factory manufacturing systems. This paper describes how streaming data can be stored in a scalable and flexible document-schema database such as MongoDB, the data store that makes up the virtual twin system. We present an architecture that allows third-party software apps to interface with the virtual manufacturing machines (VMMs). We evaluate our database schema against query statements and provide examples of how third-party apps can interface with manufacturing machines using the VMM middleware. Finally, we discuss an operating system architecture for VMMs across the manufacturing cyberspace, which necessitates command and control of various virtualized manufacturing machines, opening new possibilities for cyber-physical systems in manufacturing.
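A document database lends itself to a schema sketch: one document per streamed machine-state sample, plus the kind of filter dictionary an external app might pass to a MongoDB find() call through the middleware. The field names are illustrative assumptions, not the paper's actual schema, and no database connection is made here.

```python
import datetime

def machine_sample_doc(machine_id, status, spindle_rpm):
    """Shape of one streamed machine-state document for a
    'virtual twin' collection: identity, timestamp, status, and a
    nested telemetry sub-document that can grow per machine type."""
    return {
        "machine_id": machine_id,
        "timestamp": datetime.datetime.now(datetime.timezone.utc),
        "status": status,
        "telemetry": {"spindle_rpm": spindle_rpm},
    }

def running_filter(machine_id):
    """Query filter a third-party app might issue, e.g. with pymongo:
    collection.find(running_filter("mill-07"))."""
    return {"machine_id": machine_id, "status": "running"}
```

The nested, schema-flexible document is what makes a store like MongoDB a natural fit: different machine types can stream different telemetry fields into the same collection.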
16.
17.
William J. Rasdorf, Lisa K. Spainhour, Edward M. Patton, Bruce P. Burns 《Advances in Engineering Software》1993,16(3):145-152
Integration among programs designed to solve complex engineering problems is often lacking; this is a particular problem in the analysis and design of thick composite materials, where a large volume of input must be provided. Another problem is the lack of an archival repository in which to store input information in a generic format. To address these problems, we have developed a prototype of a finite element material property reprocessing system, called the composites database interface (CDI). In this computer-aided analysis system, a materials database is integrated with several software components, including commercially available finite element analysis (FEA) programs and preprocessors, and tools for manipulating and using composite materials data, enabling the transfer of two- and three-dimensional composite material property data into an FEA program. This paper presents the capabilities of the system, discusses the overall system integration through R:BASE, and provides a civil engineering application involving the design of a large cylindrical tank to illustrate the execution of the CDI system's various components. The paper ends by discussing the current status of this computer-aided analysis system.
18.
Karla Geiss, Martin Meyer, Martin Radespiel-Tröger, Olaf Gefeller 《Computer methods and programs in biomedicine》2009,96(1):63-71
Long-term observed and relative survival are important outcome measures of cancer patient care, reported routinely by many cancer registries, but no commercial statistical software exists for estimating relative survival or performing period survival analysis. The publicly available programs focus only on certain methods, require specific input data formats, and are often macros or functions that require an underlying software package. Here we introduce SURVSOFT, a comprehensive, user-friendly Windows program with a graphical user interface. It can handle different input data formats and incorporates a variety of nonparametric statistical methods for survival data analysis. SURVSOFT produces high-resolution graphs, which can be printed, saved or exported for use with standard graphics editors. The use of SURVSOFT is illustrated by an analysis of survival data from the Bavarian Cancer Registry.
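The basic quantity involved is easy to state: relative survival is the ratio of observed survival in the patient group to the expected survival of a demographically matched general population. A bare-bones sketch follows; real estimators such as Ederer II derive the expected survival per patient from population life tables rather than taking it as given.

```python
def relative_survival(observed, expected):
    """Relative survival at each follow-up interval: observed survival
    proportion divided by the expected survival proportion of a
    matched general population."""
    return [o / e for o, e in zip(observed, expected)]
```

A ratio near 1 means patients die at roughly the background population rate; values below 1 quantify the excess mortality attributable to the cancer.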
20.
《Environmental Modelling & Software》2000,15(4):357-372
When implemented as a computer program, an ecosystem model is only part of a larger programming environment. This programming environment includes other programs, non-model program components, program design rules, data files, and associated analytical tools. These components should be divided to allow programmers to focus on their areas of expertise, but must then be rejoined in such a way as to minimise debugging and execution overheads. We describe this larger programming environment as it surrounds a model of the ecosystem of Port Phillip Bay, Australia. The ecosystem model requires a transport model to allow spatial modelling; this transport model uses currents from a computationally intensive hydrodynamic model. Implementation of the ecosystem model also requires non-model code, such as routines to initialise parameters or the integration method, whose design determines program reliability and performance. A modular structure allows different parts of the model to be modified independently, which makes for efficient programming. We describe formal design rules used to enhance the readability and information content of the model's parameter names. To execute, the model must access data files, and a record of the run must be kept; a Unix shell program serves both these functions. The data files may require software tools for generation or manipulation. Output from the model also requires post-processing for visualisation and analysis. The model is thus only one part of a network of software whose development must be coordinated to ensure reliability and efficiency.
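The separation of model code from non-model code can be sketched with a generic integrator that knows nothing about the model except its derivative function. This illustrates the modular split the paper advocates; the Port Phillip Bay model's actual integration method is not specified here.

```python
def euler(deriv, state, dt, n_steps):
    """Generic fixed-step Euler integrator, kept separate from model
    code: the model supplies only deriv(state) -> list of rates, so
    the integration method can be swapped without touching the model."""
    for _ in range(n_steps):
        rates = deriv(state)
        state = [s + dt * r for s, r in zip(state, rates)]
    return state
```

For example, an exponential-decay 'model' is just `lambda s: [-s[0]]`; integrating it over unit time approximates exp(-1), and replacing `euler` with a higher-order scheme requires no change to the model function.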