Book. Condition: New. New stock - 'Conditions of the possibility of Experience' must mean nothing else than all that which lies immanently in the essence of Experience and therefore belongs to it indispensably. The essence of Experience that phenomenological analysis of Experience elucidates is the same as the possibility of Experience, and all that which is determined in the essence, in the possibility of Experience, is eo ipso a condition of the possibility of Experience. Through acquaintance with Husserl's work, then, I developed my way of understanding what, according to their very possibility, lies in conscious activities of mentally representing something, for example, by imagining or remembering it, or by viewing it in a picture, all these understood as forms of modified perception. As Husserl himself made clear, such reflective and descriptive analyses of mental activities according to their very possibility are carried out regardless of the way those activities have actually come to be. However, I was also interested in developmental questions, especially with regard to the activity of imagining. Hence I turned to cognitive developmental psychology in order to become acquainted with the necessary empirical material. Moreover, I conducted a pilot study with young children that I had conceived according to phenomenologically relevant aspects concerning the difference, and yet inner connection, between the activities of imagining and viewing pictures.
Book. Condition: New. New stock - Quality is the major topic in international industry today, and its importance will increase in the 1990s. Making Quality Happen presents a common-sense, step-by-step approach for implementing a Quality Improvement Process in any type of organization. The book presents Quality as the strategic weapon that will help an organization achieve its overall objective by improving the organization's customer impact, reducing its cost structure, increasing its competitive market share, and maximizing its employee productivity. Importantly, the Quality Improvement Process detailed by McNealy makes Quality an integral part of any organization, not an added or extraneous feature. Making Quality Happen is targeted at a broad audience of managers in all types of organizations around the world. The concepts and recommended actions expounded are directly applicable to, and have been implemented successfully in, large and small organizations in the public, private, and non-profit sectors. This is a hands-on, action-oriented, instructive guide to implementing a Quality Improvement Effort. It is not a theoretical, overly technical, or academic treatise. Rather, it is a proven recipe for winning the Quality revolution.
Book. Condition: New. New stock - Formal verification means having a mathematical model of a system, a language for specifying desired properties of the system in a concise, comprehensible, and unambiguous way, and a method of proof to verify that the specified properties are satisfied. When the method of proof is carried out substantially by machine, we speak of automatic verification. Symbolic Model Checking deals with methods of automatic verification as applied to computer hardware. The practical motivation for study in this area is the high and increasing cost of correcting design errors in VLSI technologies. There is a growing demand for design methodologies that can yield correct designs on the first fabrication run. Moreover, design errors that are discovered before fabrication can also be quite costly, in terms of the engineering effort required to correct the error and the resulting impact on development schedules. Aside from pure cost considerations, there is also a need on the theoretical side to provide a sound mathematical basis for the design of computer systems, especially in areas that have received little theoretical attention.
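To make the notion of automatic verification concrete, here is a minimal explicit-state sketch in Python: it checks a safety property of a toy transition system by computing the reachable-state fixpoint. The toy system and all names are invented for illustration; the book's symbolic methods represent such state sets with BDDs instead of enumerating them.

def reachable(initial, transitions):
    # Least fixpoint of R = initial union post(R), computed frontier by frontier.
    reached = set(initial)
    frontier = set(initial)
    while frontier:
        frontier = {t for s in frontier for t in transitions.get(s, ())} - reached
        reached |= frontier
    return reached

# Toy system: a counter over {0, 1, 2} that should never reach the bad state 3.
transitions = {0: [1], 1: [2], 2: [0]}
bad_states = {3}

R = reachable({0}, transitions)
print("safety property holds:", R.isdisjoint(bad_states))  # True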
Book. Condition: New. New stock - 'A valuable resource for those concerned with experimental teratology and risk assessment and those requiring general information about the causes of birth defects. The treatment of these issues is sophisticated, succinct, and logical.' - American Scientist, from a review of a previous volume. The current volume covers intergenerational factors in pregnancy outcome, the thresholds for developmental toxicants, and four other subjects.
Book. Condition: New. New stock - The current work, the first of a two-volume set, is devoted to the relevance of history to psychological theory, and the manner in which both may be worked into an empirical framework.
Book. Condition: New. New stock - Contents:
1 Introduction: 1.1 The compass of taxonomy and systematics; 1.2 The 1960s and the emergence of new ideas; 1.3 Cladistics and numerical taxonomy: the conflict; 1.4 Assumptions and philosophy of cladistics and the use of parsimony criteria; 1.5 Taxonomy and the comparative method in biology.
2 Characters, Taxa and Species: 2.1 Nature and handling of data; 2.2 Characters; 2.2.1 Discrete coding of continuous characters and ratios; 2.2.2 Identifying primitive and advanced character states; 2.2.3 Homoplasy: convergence, parallelisms and reversals; 2.2.4 Homology versus analogy; 2.2.5 Character state transitions; 2.2.6 Dealing with missing data and polymorphic characters; 2.3 Classes of characters requiring special consideration; 2.3.1 Characters subject to strong selection pressures; 2.3.2 Environmental effects; 2.3.3 Molecular sequence characters; 2.3.4 Electron microscopy and the use of microcharacters; 2.3.5 Colour as a taxonomic character; 2.3.6 Cryptic and internal characters; 2.3.7 Animal artefacts; 2.3.8 Behavioural characters; 2.4 Taxa and species concepts; 2.4.1 Phylogenetic groups: monophyly, polyphyly and paraphyly; 2.5 What is a species?; 2.5.1 Biological species concept; 2.5.2 Phylogenetic species concept; 2.5.3 Evolutionary species concept; 2.5.4 Problems with parthenogenetic species and asexual clones: some further considerations.
3 Phylogenetic Reconstruction: Cladistics and Related Methods: 3.1 Cladistics and cladograms; 3.1.1 Parsimony; 3.1.2 Compatibility analysis; 3.1.3 Maximum likelihood and related methods; 3.2 Parsimony and finding the shortest trees; 3.2.1 Finding the shortest trees and the impact of computerization; 3.2.2 Tree facts and figures; 3.2.3 Building trees from distance data; 3.2.4 Rooting trees; 3.2.5 Consistency and other indices; 3.2.6 Weighting characters; 3.2.7 Coping with multiple trees; 3.2.8 Consensus trees; 3.2.9 Comparing trees; 3.3 Which method: an overview; 3.3.1 How well does parsimony analysis estimate trees?; 3.3.2 Compatibility versus parsimony; 3.3.3 Congruence between data sets (or how do we know when to believe a phylogeny?); 3.3.4 Reticulate evolution, hybrids and intraspecific evolution; 3.4 Cladistics and classification.
4 Phenetic Methods in Taxonomy: 4.1 Introduction; 4.1.1 Similarity and distance measures; 4.1.2 Measures using binary characters; 4.1.3 Distance and similarity measures using continuous data; 4.2 Analysing similarity and distance data; 4.3 Hierarchic clustering procedures; 4.3.1 Nearest neighbour clustering; 4.3.2 Furthest neighbour (complete linkage); 4.3.3 Unweighted pair-group method using arithmetic averages (UPGMA); 4.3.4 Weighted pair-group method using arithmetic averages (WPGMA); 4.3.5 Centroid clustering; 4.4 Ordination methods; 4.4.1 Principal components analysis; 4.4.2 Principal coordinate analysis; 4.4.3 Canonical variate analysis; 4.4.4 Non-metric multidimensional scaling.
5 Keys and Identification: 5.1 Introduction; 5.1.1 Purpose of keys; 5.1.2 Good practice in writing keys; 5.2 Types of keys; 5.2.1 Dichotomous keys; 5.2.2 Multiple-entry keys; 5.3 Efficiency; 5.3.1 Length of dichotomous keys; 5.3.2 Reliability; 5.3.3 Choice of characters; 5.3.4 Likelihood of encountering taxon; 5.4 Computerized key construction; 5.4.1 Interactive identification; 5.4.2 Matching; 5.4.3 Automated taxon descriptions; 5.4.4 Databases.
6 Nomenclature and Classification: 6.1 Introduction; 6.2 The binomial system and the hierarchy of taxa; 6.3 The International Commissions; 6.3.1 Codes of nomenclature; 6.3.2 Independence of the Codes; 6.4 Basic principles of nomenclature; 6.4.1 Priority; 6.4.2 Synonymy; 6.4.3 Homonymy; 6.4.4 The type concept; 6.5 Miscellaneous group-related factors; 6.5.1 Animals and animal-like Protista; 6.5.2 Plants and plant-like Protista; 6.5.3 Fungi; 6.5.4 Lichens; 6.5.5 'Blue-green algae' (Cyanophyta versus Cyanobacteria); 6.5.6 Bacteria.
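The UPGMA procedure listed under 4.3.3 above is simple enough to sketch. Below is a minimal, illustrative Python version with invented toy distances: it repeatedly merges the two closest clusters, taking inter-cluster distance as the arithmetic mean over all member pairs, and omits the branch-length bookkeeping a real implementation would keep.

def upgma(names, dist):
    # Start with each leaf as its own cluster, labelled by its name.
    clusters = {frozenset([n]): n for n in names}

    def d(c1, c2):
        # Unweighted average distance over all member pairs.
        return sum(dist[a][b] for a in c1 for b in c2) / (len(c1) * len(c2))

    while len(clusters) > 1:
        c1, c2 = min(((x, y) for x in clusters for y in clusters if x != y),
                     key=lambda pair: d(*pair))
        label = "(%s,%s)" % (clusters.pop(c1), clusters.pop(c2))
        clusters[c1 | c2] = label
    return next(iter(clusters.values()))

# Toy symmetric distances: A and B are close, C is distant from both.
names = ["A", "B", "C"]
dist = {"A": {"B": 2.0, "C": 6.0}, "B": {"A": 2.0, "C": 6.0},
        "C": {"A": 6.0, "B": 6.0}}
print(upgma(names, dist))  # ((A,B),C)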
Book. Condition: New. New stock - Contents:
A Methodology Called 'MINK' for Study of Climate Change Impacts and Responses on the Regional Scale: An Introductory Editorial.
Paper 1: The MINK Methodology: Background and Baseline.
Paper 2: Agricultural Impacts of and Responses to Climate Change in the Missouri-Iowa-Nebraska-Kansas (MINK) Region.
Paper 3: Impacts and Responses to Climate Change in Forests of the MINK Region.
Paper 4: Climate Change Impacts on Water Resources and Possible Responses in the MINK Region.
Paper 5: Climate Change Impacts on the Energy Sector and Possible Adjustments in the MINK Region.
Paper 6: Consequences of Climate Change for the MINK Economy: Impacts and Responses.
An Overview of the MINK Study.
Book. Condition: New. New stock - Multiprocessing: Trade-Offs in Computation and Communication presents an in-depth analysis of several commonly observed regular and irregular computations for multiprocessor systems. This book includes techniques which enable researchers and application developers to quantitatively determine the effects of algorithm data dependencies on execution time, on communication requirements, on processor utilization, and on the speedups possible. Starting with simple, two-dimensional, diamond-shaped directed acyclic graphs, the analysis is extended to more complex and higher-dimensional directed acyclic graphs. The analysis allows for the quantification of the computation and communication costs and their interdependencies. The practical significance of these results for the performance of various data distribution schemes is clearly explained. Using these results, the performance of the parallel computations is formulated in an architecture-independent fashion. These formulations allow for the parameterization of architecture-specific entities such as the computation and communication rates. This type of parameterized performance analysis can be used at compile time or at run time to achieve the best distribution of the computations. The material in Multiprocessing: Trade-Offs in Computation and Communication connects theory with practice, so that the inherent performance limitations in many computations can be understood, and practical methods can be devised that would assist in the development of software for scalable high-performance systems.
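As a hedged illustration of such parameterized, architecture-independent performance formulations, the Python sketch below models the execution time of an n x n grid computation block-distributed over p processors in terms of computation and communication rates. The formula and parameter names are invented for illustration and are not the book's exact analysis.

def exec_time(n, p, flop_rate, word_rate):
    # Per-processor computation cost for an n x n grid split over p blocks.
    comp = (n * n / p) / flop_rate
    if p == 1:
        return comp                      # no communication on one processor
    block_side = n / p ** 0.5            # boundary length per square block
    comm = 4 * block_side / word_rate    # exchange one word per boundary point
    return comp + comm

# Speedup over one processor for an assumed machine balance.
t1 = exec_time(4096, 1, 1e9, 1e8)
for p in (4, 16, 64):
    print(p, round(t1 / exec_time(4096, p, 1e9, 1e8), 1))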
Book. Condition: New. New stock - Parsing technology traditionally consists of two branches, corresponding to the two main application areas of context-free grammars and their generalizations: efficient deterministic parsing algorithms have been developed for programming languages, while quite different algorithms are employed for analyzing natural language. The Functional Treatment of Parsing provides a functional framework within which the different traditional techniques are restated and unified. The resulting theory yields new recursive implementations of parsers for context-free grammars. The new implementations, called recursive ascent parsers, avoid explicit manipulation of parse stacks and parse matrices, and are in many ways superior to conventional implementations. They are applicable to grammars for programming languages as well as natural languages. The book has been written primarily for students and practitioners of parsing technology. With its emphasis on modern functional methods, however, it will also benefit scientists interested in functional programming. The Functional Treatment of Parsing is an excellent reference and can be used as a text for a course on the subject.
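A full recursive ascent parser is too long to reproduce here, but the functional style of parsing can be suggested with a tiny parser-combinator sketch, written here in Python for illustration. Note that this is plain recursive descent over lists of results, simpler than and distinct from the book's recursive ascent construction; all names are invented.

# A parser is a function from (string, position) to a list of
# (value, next_position) results; grammar operators combine such functions.

def lit(c):
    return lambda s, i: [(c, i + 1)] if s[i:i + 1] == c else []

def alt(p, q):
    return lambda s, i: p(s, i) + q(s, i)

def seq(p, q):
    return lambda s, i: [((a, b), k) for a, j in p(s, i) for b, k in q(s, j)]

# Grammar: A -> 'a' A | 'b'   (one or more 'a's followed by a 'b')
def A(s, i):
    return alt(seq(lit('a'), A), lit('b'))(s, i)

print(A("aab", 0))  # [(('a', ('a', 'b')), 3)] : the whole input parses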
Book. Condition: New. New stock - Dean Pomerleau's trainable road tracker, ALVINN, is arguably the world's most famous neural net application. It currently holds the world record for distance traveled by an autonomous robot without interruption: 21.2 miles along a highway, in traffic, at speeds of up to 55 miles per hour. Pomerleau's work has received worldwide attention, including articles in Business Week (March 2, 1992), Discover (July 1992), and German and Japanese science magazines. It has been featured in two PBS series, 'The Machine That Changed the World' and 'By the Year 2000', and has appeared in news segments on CNN, the Canadian news and entertainment program 'Live It Up', and the Danish science program 'Chaos'. What makes ALVINN especially appealing is that it does not merely drive: it learns to drive, by watching a human driver for roughly five minutes. The training inputs to the neural network are a video image of the road ahead and the current position of the steering wheel. ALVINN has learned to drive on single-lane, multi-lane, and unpaved roads. It adapts rapidly to other sensors: it learned to drive at night using laser reflectance imaging, and with a laser rangefinder it learned to swerve to avoid obstacles and to maintain a fixed distance from a row of parked cars. It has even learned to drive backwards.
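The published descriptions suggest a small feedforward network mapping a coarse road image to a steering response. The Python sketch below trains a toy network of that general shape on synthetic (image, steering) pairs by plain backpropagation; all dimensions and data here are invented stand-ins, not ALVINN's actual configuration.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 30 * 32, 5           # toy "retina" and small hidden layer

# Synthetic demonstration: steering correlates with a fixed direction in
# image space, a crude stand-in for the road's lateral offset.
road_direction = rng.normal(size=n_in)
X = rng.normal(size=(500, n_in))           # 500 toy camera frames
y = X @ road_direction / n_in              # one steering target per frame

W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
W2 = rng.normal(scale=0.1, size=n_hid)
lr = 0.05
for _ in range(300):                        # plain batch backpropagation
    h = np.tanh(X @ W1)
    pred = h @ W2
    err = pred - y
    gW2 = h.T @ err / len(X)
    gW1 = X.T @ (np.outer(err, W2) * (1 - h ** 2)) / len(X)
    W2 -= lr * gW2
    W1 -= lr * gW1
print("mean squared steering error:", float(np.mean(err ** 2)))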
Book. Condition: New. New stock - In transfusion medicine the scientific fundamentals of immunology have had a considerable clinical impact. Transfusion may suppress immunity, and some patients can suffer disadvantages including GvHD, alloimmunisation, and possibly cancer; white cells (WBC) play pivotal roles in these phenomena by presenting antigens and producing cytokines. A clinical application of this knowledge is LAK cells targeted against cancer. MHC on the WBC may provide additional immunological modulation through series of secondary messengers. Thus reduction of WBC in blood and bone marrow may be advantageous for patients. On the other hand, sharing part of the MHC, or rendering the transplanted white cells anergic by storage, may be even more advantageous. CMV infection can mimic part of this MHC. UV radiation is effective in inactivating WBC, although filters are an easy means of removal; accurate quantification of residual WBC, however, requires flow cytometry, which has considerable potential application in blood transfusion. Idiotypic antibodies could play an important role in platelet therapy. The potential infection risks in transfusion, such as HIV and HCV, remain, but molecular biological methods such as PCR or RT-PCR have great potential in the detection of infectious diseases, in transplantation, and in genetic disorders. Immunoaffinity-purified concentrates, such as factor IX and protein C, could reduce patients' immune functions; in the future, protein C could be derived from transgenic animals. Advances are sure to emerge through adoptive immunotherapy, and gene therapies are an exciting prospect when genes transferred into lymphocytes can be used to correct cell-mediated immune deficiency, as in ADA deficiency.
Book. Condition: New. New stock - Scientific research is viewed as a deliberate activity, and the logic of discovery consists of strategies and arguments whereby the best objectives (questions) and the optimal means for achieving these objectives (heuristics) are chosen. This book includes a discussion of, and some proposals regarding, the way the logic of questions can be applied to understanding scientific research, and it draws upon work in artificial intelligence in a discussion of heuristics and methods for appraising heuristics (metaheuristics). It also discusses a third source of scientific objectives and heuristics: episodes and exemplars from the history of science and the history of philosophy. The book is written to be accessible to advanced students in philosophy and to the scientific community. It is of interest to philosophers of science, philosophers of biology, historians of physics, and historians of biology.
Book. Condition: New. New stock - Scanning near-field optical microscopy (SNOM, also known as NSOM) is a new local-probe technique with a resolving power of 10-50 nm. Not being limited by diffraction, near-field optics (NFO) opens new perspectives for optical characterization and the understanding of optical phenomena, in particular in biology, microelectronics, and materials science. SNOM, after first demonstrations in 1983/84, has undergone rapid development in the past two to four years. The increased interest has been largely stimulated by the wealth of optical properties that can be investigated and by the growing importance of characterization on the nanometer scale in general. Examples include the use of fluorescence, birefringence, and plasmon effects, to name just a few. This volume emerged from the first international meeting devoted exclusively to NFO and comprises a complete survey of the 1992 activities in the field: in particular, the variety of instrumental techniques currently being explored, demonstrations of the imaging capabilities, and theoretical interpretations, a highly nontrivial task. The comprehensive collection of papers devoted to these and related subjects makes the book a valuable tool for anybody interested in near-field optics.
Book. Condition: New. New stock - The Royal Society has initiated a series of meetings to discuss the effect that advances in technology will have on our way of life in the next century. The two previous meetings were concerned with housing and waste treatment. The subject of the third meeting, communications, is no less critical to life, but it presents particular problems and uncertainties, especially in the forecasting of future trends. Indeed, some have doubted whether there can be profitable debate on long-term development in such a fast-moving field. The importance of the topic justifies the attempt, and the reader will judge whether the authors have met the challenge. Communications today bears little resemblance to that of the 1970s. Then we knew about satellites and optical fibres, and we had seen lasers and silicon chips, but most of us could never have imagined the potential of the new technologies within our grasp. Nor had we gauged the population's thirst for more and better ways of talking and writing to each other. It was the combination of market need and technical capability that created the communications revolution.
Book. Condition: New. New stock - Effective Polynomial Computation is an introduction to the algorithms of computer algebra. It discusses the basic algorithms for manipulating polynomials, including factoring polynomials. These algorithms are discussed from both a theoretical and a practical perspective. Those cases where theoretically optimal algorithms are inappropriate are discussed, and practical alternatives are explained. Effective Polynomial Computation provides much of the mathematical motivation of the algorithms discussed, to help the reader appreciate the mathematical mechanisms underlying them, so that the algorithms do not appear to be constructed out of whole cloth. Preparatory to the discussion of algorithms for polynomials, the first third of the book treats related issues in elementary number theory. These results are either used in later algorithms (e.g. the discussion of lattices and Diophantine approximation), or analogues of the number-theoretic algorithms are applied to polynomial problems (e.g. the Euclidean algorithm and p-adic numbers). Among the unique features of Effective Polynomial Computation is the detailed material on greatest common divisor and factoring algorithms for sparse multivariate polynomials. In addition, both deterministic and probabilistic algorithms for irreducibility testing of polynomials are discussed.
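The polynomial analogue of the Euclidean algorithm mentioned above can be sketched directly. Below is a minimal, naive Python version for dense univariate polynomials over the rationals (coefficient lists, lowest degree first); the book's sparse multivariate GCD algorithms are far more sophisticated.

from fractions import Fraction

def poly_rem(a, b):
    # Remainder of a divided by b (b nonzero), by repeated leading-term
    # elimination; coefficients are Fractions, lowest degree first.
    a = a[:]
    while len(a) >= len(b) and any(a):
        c = Fraction(a[-1]) / b[-1]
        shift = len(a) - len(b)
        for i, bc in enumerate(b):
            a[shift + i] -= c * bc
        while a and a[-1] == 0:
            a.pop()                      # drop trailing zero coefficients
    return a

def poly_gcd(a, b):
    # Euclidean algorithm: gcd(a, b) = gcd(b, a mod b); result made monic.
    while any(b):
        a, b = b, poly_rem(a, b)
    return [c / a[-1] for c in a]

# gcd(x^2 - 1, x^2 + 2x + 1) = x + 1
f = [Fraction(-1), Fraction(0), Fraction(1)]
g = [Fraction(1), Fraction(2), Fraction(1)]
print(poly_gcd(f, g))  # [Fraction(1, 1), Fraction(1, 1)], i.e. 1 + x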
Book. Condition: New. New stock - Colorectal cancer is a collective term for a heterogeneous group of diseases. In a large proportion of cases, the condition is attributable to genetic predisposition. Those directly involved in the treatment of patients with cancer of the large bowel are confronted to an increasing degree with the genetic aspects of the disease. In familial and hereditary forms of the disorder, periodic screening of the close relatives of patients can in principle prevent disease and death from colorectal cancer. Presymptomatic diagnosis by means of DNA technology is now possible in many cases of familial adenomatous polyposis. Genetic diagnosis will become increasingly important for the identification of high-risk groups. This book summarizes those aspects of the genetics of colorectal cancer that are important for clinical practice. It has been stated that clinicians can contribute to the goal of reducing mortality from cancer by asking each patient about his or her family history of cancer. The aim of this book is to provide a guideline for the management of those situations in which the family history of colorectal cancer is found to be positive.
Book. Condition: New. New stock - The previous volume of this series on soft tissue sarcomas highlighted the importance of the multidisciplinary approach to treatment, a focus continued in the present edition. Proper diagnosis and staging remain the cornerstone of the treatment strategy. Sophisticated histopathology techniques and growing consensus on grading systems have further increased the importance of the histopathologist, who provides estimates of the patient's prognosis as well as data for planning the treatment strategy. The use of cytogenetics is relatively new in this field and might enable the distinction of subgroups within specific histological tumor types. Furthermore, molecular biological studies not only help to reveal inherited predispositions and details of oncogenesis in tumor development, but may also provide additional predictive factors for tumor behavior. Further data for treatment strategy will be provided by diagnostic imaging, a field in which the role of magnetic resonance imaging is rapidly developing. As far as actual treatment is concerned, surgery still provides the major chance for cure. In view of the endeavor to be as sparing as possible, the addition of radiotherapy to surgery is of utmost importance. Usually radiotherapy is given after surgery, but the optimal sequence of the two modalities still needs to be defined. The combined use of surgery with radiotherapy and/or chemotherapy does have an impact on wound healing.
Book. Condition: New. New stock - '. . . At last the doctor will be freed from the tedious interpretation of screens and photographs. Instead, he will examine and scan through his patient directly. Wearing optical-shutter spectacles and aiming a pulsed laser torch, he will be able to peer at the beating heart, study the movement of a joint or the flexing of a muscle, press on suspect areas to see how the organs beneath respond, check that pills have been correctly swallowed or that an implant is safely in place, and so on. A patient wearing white cotton or nylon clothes that scatter but hardly absorb light may not even have to undress . . .' - David Jones, Nature (1990) 348:290. Optical imaging of the brain is a rapidly growing field of heterogeneous techniques that has recently attracted considerable interest due to a number of theoretical advantages over other brain imaging modalities: it uses nonionizing radiation, offers high spatial and temporal resolution, and supplies new types of metabolic and functional information. From a practical standpoint it is important that bedside examinations seem feasible and that implementations will be considerably less expensive than competing techniques. In October 1991, a symposium was held at the Eibsee near Garmisch, Germany, to bring together the leading scientists in this new field.
Book. Condition: New. New stock - A quarter of a century has elapsed since I gave my first course in structural reliability to graduate students at the University of Waterloo in Canada. Since that time I have given many courses and seminars to students, researchers, designers, and site engineers interested in reliability. I also participated in, and was responsible for, numerous projects where reliability solutions were required. During that period, the scope of structural reliability gradually enlarged to become a substantial part of the general reliability theory. First, it became apparent that bearing structures should not be isolated objects of interest and that, consequently, constructed facilities as a whole should be studied. Second, a new engineering branch has emerged: reliability engineering. These two facts have highlighted new aspects and called for new approaches to the theory and its applications. I always state in my lectures that reliability theory is nothing more than mathematized engineering judgment. In fact, thanks mainly to probability and statistics, and also to computers, the empirical knowledge gained through Humankind's construction experience could be transposed into a pattern of logical thinking, able to produce conclusions and to forecast the behavior of engineering entities. This manner of thinking has developed into an intricate network linked by certain rules, which, in a way, can be considered a type of reliability grammar. We can discern many grammatical concepts in the general structure of reliability theory.
Book. Condition: New. New stock - The modern system-wide approach to applied demand analysis emphasizes a unity between theory and applications. Its firm foundations in economic theory make it one of the most impressive areas of applied econometrics. This book presents a large number of applications of recent innovations in the area. The database used consists of about 18 annual observations for 10 commodities in 18 OECD countries (more than 3,100 data points). Such a large body of data should provide convincing evidence, one way or the other, about the validity of consumption theory. A PREVIEW OF THE BOOK: The overall importance of the analysis presented in the book can be seen from the following table, which shows the significant contribution of the OECD to the world economy. As can be seen, the 24 member countries accounted for about 50 percent of world GDP in 1975. In this book we present an extensive analysis of the consumption patterns of the OECD countries.
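As a rough illustration of the kind of estimation such a cross-country database supports, the Python sketch below fits a single double-log demand equation by least squares on synthetic data. The specification, names, and numbers are invented for illustration; the book estimates full system-wide demand models, not a single equation.

import numpy as np

rng = np.random.default_rng(1)
n = 18 * 18                              # e.g. 18 years x 18 countries, one good

log_price = rng.normal(0.0, 0.3, n)
log_income = rng.normal(10.0, 0.5, n)
true_price_elast, true_income_elast = -0.8, 1.1
log_q = (2.0 + true_price_elast * log_price
         + true_income_elast * log_income + rng.normal(0, 0.05, n))

# OLS on log q = a + b log p + c log y: b and c are the elasticities.
X = np.column_stack([np.ones(n), log_price, log_income])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
print("estimated price and income elasticities:", beta[1:].round(2))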
Book. Condition: New. New stock - ZBIGNIEW OZIEWICZ, University of Wroclaw, Poland, December 1992. The First Max Born Symposium in Theoretical and Mathematical Physics, organized by the University of Wroclaw, was held in September 1991 with the intent that it would become an annual event. It is the outgrowth of the annual seminars organized jointly since 1972 with the University of Leipzig. The name of the Symposia was proposed by Professor Jan Lopuszanski. Max Born, an outstanding German theoretical physicist, was born in 1883 in Breslau (the German name of Wroclaw) and educated here. The Second Max Born Symposium was held during the four days 24-27 September 1992 in the old Sobotka Castle, 30 km west of Wroclaw. The Sobotka Castle was built in the eleventh century. The dates engraved on the walls of the Castle are 1024, 1140, and, from the last rebuilding, 1885. The castle served as a cloister until the end of the sixteenth century.
Book. Condition: New. New stock - The Twelfth International Workshop on Maximum Entropy and Bayesian Methods in Sciences and Engineering (MaxEnt 92) was held in Paris, France, at the Centre National de la Recherche Scientifique (CNRS), July 19-24, 1992. It is worth noting that, since the series' creation in 1980 by researchers of the physics department at the University of Wyoming in Laramie, this was only the second time the workshop took place in Europe; the first was in 1988 in Cambridge. The two distinguishing features of MaxEnt workshops are their spontaneous and informal character, which gives participants the opportunity to discuss freely and to form fruitful scientific relationships and friendships. This year's organizers had set two main objectives: (i) to have more participants from the European countries, and (ii) to give special attention to maximum entropy and Bayesian methods in signal and image processing. We are happy to report that we achieved these objectives: (i) we had about 100 participants, more than 50 percent of them from European countries; (ii) we received many papers on signal and image processing subjects and could dedicate a full day of the workshop to image modelling, restoration, and reconstruction problems.
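As a minimal illustration of the maximum entropy principle behind many of the workshop's methods: among all distributions on a finite set with a prescribed mean, the entropy maximizer has the exponential form p_i proportional to exp(-lambda * x_i), and lambda can be found numerically. The Python sketch below, with invented numbers, does this by bisection.

import math

xs = range(6)            # toy support {0, ..., 5}
target_mean = 1.5        # the single moment constraint

def mean_for(lam):
    # Mean of the exponential-family distribution p_i ~ exp(-lam * x_i).
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return sum(x * wi for x, wi in zip(xs, w)) / z

lo, hi = -50.0, 50.0                 # mean_for is decreasing in lam
for _ in range(200):                 # bisection on the mean constraint
    mid = (lo + hi) / 2
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
w = [math.exp(-lam * x) for x in xs]
z = sum(w)
p = [wi / z for wi in w]
print("lambda = %.4f, mean = %.4f" % (lam, sum(x * pi for x, pi in zip(xs, p))))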