Book. Condition: New. New item - Models in system design follow the general trend in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their models so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. This complexity has also led CAD-tool developers to create formal tools, theories and methods to improve the relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships. Performance Modeling: When it comes to design choices and trade-offs, performance is generally the deciding factor. However, performance estimates have to be extracted at a very early stage of system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. It encompasses the whole system, including software modeling, and has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout. Information Modeling: Specification and formalism have traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and users have seemed content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of the volume addresses aspects of the techniques being used.
In particular, it considers a specific formalism, called information modeling, which has recently gained increasing acceptance and is now a key part of many of the proposals in the EDA Standards Roadmap, a development that promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms. Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.
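As a minimal illustration of the kind of information modeling the blurb describes (objects, their properties, and relationships), the hypothetical sketch below builds a tiny design model. The Cell/Pin/Net names are illustrative only and are not taken from any EDA standard or from the book itself:

```python
from dataclasses import dataclass, field

# Hypothetical information-model sketch: entities (objects), their
# attributes (properties), and the relationships between them.
# Cell, Pin and Net are invented names for illustration only.

@dataclass
class Pin:
    name: str          # property
    direction: str     # property: "in" or "out"

@dataclass
class Net:
    name: str
    pins: list = field(default_factory=list)   # relationship: a Net connects Pins

@dataclass
class Cell:
    name: str
    pins: list = field(default_factory=list)   # relationship: a Cell owns Pins

# Build a tiny model: an inverter whose output pin is attached to a net.
inv = Cell("INV1", pins=[Pin("A", "in"), Pin("Y", "out")])
n1 = Net("n1")
n1.pins.append(inv.pins[1])    # relate the output pin to the net

print(n1.pins[0].name)   # -> Y
```

The point of such a model is that tools can traverse the relationships (which pins a net connects, which cell owns a pin) without knowing anything about the physical design the objects describe.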
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - Modern economic growth is characterized by structural changes based on the introduction of new technologies into the economy. The replacement and renovation of technologies in industrial environments undergoing technical change is clearly one of the key aspects of economic development. The mathematical modeling of evolutionary economics under technical change (TC) has been rigorously considered by many authors during the last decades. There is a wide variety of economic approaches and models describing different aspects of technical change. Among these are the models of embodied technical progress [19], [35], [70], [129], endogenous growth models [94], [102], the models of technological innovations [31], [32], [41], and others. A promising self-organization evolutionary approach is developed in [20], [38], [122], [123], [124], [126], which unites the aspects of diffusion of new technologies, technological and behavioral diversity of firms, learning mechanisms, age-dependent effects, and other important features of real-life economies. On the whole, interest in evolutionary economics has brought considerable progress in the description and conceptualization of the sources, characteristics, direction and effects of technical change [125]. However, the modeling and control of technology lifetime under technical change has received rather little attention in mathematical economics, in contrast to other aspects of technical progress. The lifetime of technologies has rarely been treated formally as part of a more general mathematical theory of economic dynamics. A problem still to be resolved is that of establishing rational strategies for the replacement of technologies under various assumptions about the behavior of technical change.
Book. Condition: New. New item - IFIP's Working Group 2.7 (13.4)* has, since its establishment in 1974, concentrated on the software problems of user interfaces. From its original interest in operating systems interfaces the group has gradually shifted emphasis towards the development of interactive systems. The group has organized a number of international working conferences on interactive software technology, the proceedings of which have contributed to the accumulated knowledge in the field. The current title of the Working Group is 'User Interface Engineering', with the aim of investigating the nature, concepts, and construction of user interfaces for software systems. The scope of work involved is: - to increase understanding of the development of interactive systems; - to provide a framework for reasoning about interactive systems; - to provide engineering models for their development. This report addresses all three aspects of the scope, as further described below. In 1986 the working group published a report (Beech, 1986) with an object-oriented reference model for describing the components of operating systems interfaces. The model was implementation oriented and built on an object concept and the notion of interaction as consisting of commands and responses. Through working with that model the group addressed a number of issues, such as multi-media and multi-modal interfaces, customizable interfaces, and history logging. However, a conclusion was reached that many software design considerations and principles are independent of implementation models, but do depend on the nature of the interaction process.
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - From the Foreword. Modern digital signal processing applications present a large challenge to the system designer. Algorithms are becoming increasingly complex, and yet they must be realized with tight performance constraints. Nevertheless, these DSP algorithms are often built from many constituent canonical subtasks (e.g., IIR and FIR filters, FFTs) that can be reused in other subtasks. Design is then a problem of composing these core entities into a cohesive whole to provide both the intended functionality and the required performance. In order to organize the design process, there have been two major approaches. The top-down approach starts with an abstract, concise, functional description which can be quickly generated. On the other hand, the bottom-up approach starts from a detailed low-level design where performance can be directly assessed, but where the requisite design and interface detail take a long time to generate. In this book, the authors show a way to effectively resolve this tension by retaining the high-level conciseness of VHDL while parameterizing it to get a good fit to specific applications through reuse of core library components. Since they build on a pre-designed set of core elements, accurate area, speed and power estimates can be percolated to high-level design routines which explore the design space. Results are impressive, and the cost model provided will prove to be very useful. Overall, the authors have provided an up-to-date approach, doing a good job of getting performance out of high-level design. The methodology provided makes good use of extant design tools, and is realistic in terms of the industrial design process. The approach is interesting in its own right, but is also of direct utility, and it will give the existing DSP CAD tools a highly competitive alternative.
The techniques described have been developed within ARPA's RASSP (Rapid Prototyping of Application Specific Signal Processors) project, and should be of great interest there, as well as to many industrial designers. Professor Jonathan Allen, Massachusetts Institute of Technology.
Book. Condition: New. New item - Recent Advances in Robot Learning contains seven papers on robot learning written by leading researchers in the field. As the selection of papers illustrates, the field of robot learning is both active and diverse. A variety of machine learning methods, ranging from inductive logic programming to reinforcement learning, is being applied to many subproblems in robot perception and control, often with objectives as diverse as parameter calibration and concept formulation. While no unified robot learning framework has yet emerged to cover the variety of problems and approaches described in these papers and other publications, a clear set of shared issues underlies many robot learning problems. Machine learning, when applied to robotics, is situated: it is embedded in a real-world system that tightly integrates perception, decision making and execution. Since robot learning involves decision making, there is an inherent active learning issue. Robotic domains are usually complex, yet the expense of using actual robotic hardware often prohibits the collection of large amounts of training data. Most robotic systems are real-time systems; decisions must be made within critical or practical time constraints. These characteristics present challenges and constraints to the learning system. Since these characteristics are shared by other important real-world application domains, robotics is a highly attractive area for research on machine learning. On the other hand, machine learning is also highly attractive to robotics, since there is a great variety of open problems in robotics that defy a static, hand-coded solution. Recent Advances in Robot Learning is an edited volume of peer-reviewed original research comprising seven invited contributions by leading researchers. This research work has also been published as a special issue of Machine Learning (Volume 23, Numbers 2 and 3).
Book. Condition: New. New item - Although adaptive filtering and adaptive array processing began with research and development efforts in the late 1950s and early 1960s, it was not until the publication of the pioneering books by Honig and Messerschmitt in 1984 and Widrow and Stearns in 1985 that the field of adaptive signal processing began to emerge as a distinct discipline in its own right. Since 1984 many new books have been published on adaptive signal processing, which serve to define what we will refer to throughout this book as conventional adaptive signal processing. These books deal primarily with basic architectures and algorithms for adaptive filtering and adaptive array processing, with many of them emphasizing practical applications. Most of the existing textbooks on adaptive signal processing focus on finite impulse response (FIR) filter structures that are trained with strategies based on steepest descent optimization, or more precisely, the least mean square (LMS) approximation to steepest descent. While literally hundreds of archival research papers have been published that deal with more advanced adaptive filtering concepts, none of the current books attempts to treat these advanced concepts in a unified framework. The goal of this new book is to present a number of important, but not so well known, topics that currently exist scattered in the research literature. The book also documents some new results that have been conceived and developed through research conducted at the University of Illinois during the past five years.
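The conventional LMS-trained FIR structure that the blurb mentions can be sketched in a few lines. This is a generic textbook-style illustration, not code from the book; the step size, filter length, and the unknown system being identified are arbitrary choices for the demo:

```python
import random

# LMS adaptation of an FIR filter: at each step the tap weights move
# along the instantaneous (approximate) negative gradient of the squared
# error, which is the LMS approximation to steepest descent.

def lms(x, d, n_taps=4, mu=0.05):
    """Adapt FIR taps w so the filter output tracks the desired signal d."""
    w = [0.0] * n_taps
    sq_errors = []
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1 : k + 1][::-1]            # most recent sample first
        y = sum(wi * ui for wi, ui in zip(w, u))       # FIR filter output
        e = d[k] - y                                   # instantaneous error
        w = [wi + mu * e * ui for wi, ui in zip(w, u)] # LMS weight update
        sq_errors.append(e * e)
    return w, sq_errors

# Demo: identify an unknown 2-tap system h = [0.5, -0.3] (arbitrary choice).
random.seed(0)
x = [random.gauss(0, 1) for _ in range(2000)]
d = [0.5 * x[k] - (0.3 * x[k - 1] if k > 0 else 0.0) for k in range(len(x))]
w, sq_errors = lms(x, d)
print(round(w[0], 2), round(w[1], 2))   # taps converge near 0.5 and -0.3
```

With white input and no measurement noise the first two taps converge to the unknown system's coefficients and the remaining taps to zero; the more advanced topics the book treats go beyond this basic FIR/LMS combination.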
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - Catheter-delivered therapeutic ultrasound angioplasty is a new technique for use in the treatment of obstructive vascular disease. The treatment differs from balloon angioplasty in that it has been shown experimentally to cause disintegration of calcific and fibrotic atherosclerotic plaques, thrombus dissolution and arterial vasodilation. In contrast to laser technology, ultrasound systems are relatively inexpensive and simple to use and maintain. In the clinical trials detailed in this text, ultrasound angioplasty has been shown to be feasible and safe. Ultrasound Angioplasty is a comprehensive text, addressing the theoretical, experimental and clinical issues. The international contributions reflect the excitement, interest, spirit and cooperation in the research and development of therapeutic ultrasound.
Book. Condition: New. New item - Teleservice is a common concept for distributed application services related to the use of telecommunication equipment, PCs, workstations and mainframes. Teleservices represent a diversity of applications related to various user and vendor cultures, such as traditional telecommunications services, e-mail services, cooperative work applications, multimedia applications, mobile services and intelligent network services. The complexity and diversity of teleservices are increasing, but of greater importance is the change in the way in which teleservices are designed, delivered and maintained. Information Network and Data Communications captures the cultural as well as the technical variety of teleservices.
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - Input/Output in Parallel and Distributed Computer Systems has attracted increasing attention over the last few years, as it has become apparent that input/output performance, rather than CPU performance, may be the key limiting factor in the performance of future systems. This I/O bottleneck is caused by the increasing speed mismatch between processing units and storage devices, the use of multiple processors operating simultaneously in parallel and distributed systems, and the increasing I/O demands of new classes of applications, such as multimedia. It is also important to note that, to varying degrees, the I/O bottleneck exists at multiple levels of the memory hierarchy. All indications are that the I/O bottleneck will be with us for some time to come, and is likely to increase in importance. Input/Output in Parallel and Distributed Computer Systems is based on papers presented at the 1994 and 1995 IOPADS workshops held in conjunction with the International Parallel Processing Symposium. This book is divided into three parts. Part I, the Introduction, contains four invited chapters which provide a tutorial survey of I/O issues in parallel and distributed systems. The chapters in Parts II and III contain selected research papers from the 1994 and 1995 IOPADS workshops; many of these papers have been substantially revised and updated for inclusion in this volume. Part II collects the papers from both years which deal with various aspects of system software, and Part III addresses architectural issues. Input/Output in Parallel and Distributed Computer Systems is suitable as a secondary text for graduate level courses in computer architecture, software engineering, and multimedia systems, and as a reference for researchers and practitioners in industry.
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - Research in the area of Balanced Automation is still young and emerging. In our opinion, the development of hybrid balanced solutions to cope with a variety of automation levels and manual approaches is a much more challenging research problem than the search for a purely automatic solution. Various research activities described in this book illustrate some of these challenges through development proposals, assisting tools, and initial results. In certain chapters, however, the balancing aspects are not yet achieved in the research area, but their inclusion in this book is intended to give a broader and more comprehensive perspective of the multiple areas involved. One important aspect to be noticed is the extension and application of the concept of balanced automation to all areas of the manufacturing enterprise. Clearly, the need for a 'balanced' approach is not restricted to the shop-floor components; rather, it applies to all other areas, as illustrated by the wide spectrum of research contributions found in this book. For instance, the need for an appropriate integration of multiple systems and their perspectives is particularly important for the implementation of virtual enterprises. Although both the BASYS'95 and BASYS'96 conferences have provided important contributions, approaches, and tools for the implementation of balanced automation systems, there are a number of areas that require further research.
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New item - The International Conference on the History of Original Ideas and Basic Discoveries, held at the 'Ettore Majorana' Centre for Scientific Culture in Erice, Sicily, July 27-August 4, 1994, brought together sixty leading scientists, including many Nobel Laureates in high energy physics, principal contributors in other fields of physics such as high-Tc superconductivity, particle accelerators and detector instrumentation, and thirty-six talented younger physicists selected from candidates throughout the world. The scientific program, including 49 lectures and a discussion session on the 'Status and Future Directions in High Energy Physics', was inspired by the conference theme: the key experimental discoveries and theoretical breakthroughs of the last 50 years, in particle physics and related fields, have led us to a powerful description of matter in terms of three quark and three lepton families and four fundamental interactions. The most recent generation of experiments at e+e- and proton-proton colliders, and corresponding advances in theoretical calculations, have given us remarkably precise determinations of the basic parameters of the electroweak and strong interactions. These developments, while showing the striking internal consistency of the Standard Model, have also sharpened our view of the many unanswered questions which remain for the next generation: the origin and pattern of particle masses and families, the unification of the interactions including gravity, and the relation between the laws of physics and the initial conditions of the universe.