The Emergence of Probability

A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference

Author: Ian Hacking

Publisher: Cambridge University Press

ISBN: 9780521685573

Category: History

Page: 209


Includes a new introduction, contextualizing his book in light of new work and philosophical trends.

Strategic Analysis Of Financial Markets, The (In 2 Volumes)

Author: Moffitt Steven D

Publisher: World Scientific Publishing Company

ISBN: 9813143770

Category: Business & Economics

Page: 1120


Volume 1 of "The Strategic Analysis of Financial Markets" (Framework) is premised on the belief that markets can be understood only by dropping the assumptions of rationality and efficient markets in their extreme forms, and by showing that markets still have an inherent order and inherent logic. That order results primarily from the "predictable irrationality" of investors, as well as from people's uncoordinated attempts to profit. The market patterns that result do not rely on rationality or efficiency. A framework is developed for understanding financial markets using a combination of psychology, statistics, game and gambling analysis, market history and the author's experience. It expresses analytically how professional investors and traders think about markets: as games in which other participants employ inferior, partially predictable strategies. The interactions of those strategies can be toxic and lead to booms, bubbles, busts and crashes, or can be less dramatic, leading to various patterns that are mistakenly called "market inefficiencies" and "stylized facts." A logical case is constructed from two foundations: the psychology of human decision making and the "Fundamental Laws of Gambling." Applying the Fundamental Laws to trading leads to the idea of "gambling rationality" (grationality), which replaces the efficient-market concept of "rationality." By classifying things that are likely to have semi-predictable price impacts (price "distorters"), one can identify, explore through data analysis, and create winning trading ideas and systems. A structured way of doing all this is proposed: the six-step "Strategic Analysis of Markets Method" (SAMM). Examples are given in this and Volume 2. Volume 2 of "The Strategic Analysis of Financial Markets" (Trading System Analytics) continues the development of Volume 1 by introducing tools and techniques for developing trading systems and by illustrating them using real markets.
What distinguishes these two volumes from the rest of the literature is their rigor. They describe trading as a form of gambling that, when properly executed, is quite logical and is well known to professional gamblers and analytical traders. But even those elites might be surprised at the extent to which quantitative methods have been justified and applied, including a life-cycle theory of trading systems. Apart from a few sections that develop background material, Volume 2 creates from scratch a trading system for Eurodollar futures using principles of the Strategic Analysis of Markets Method (SAMM), a principled, step-by-step approach to developing profitable trading systems. It devotes an entire chapter to mechanical methods for testing and improving trading systems, which transcends the rather unstructured and unsatisfactory "backtesting" literature. It presents a breakout trend-following system developed using factor models. It also presents a specific pairs-trading system and discusses its life cycle from an early, highly profitable period to its eventual demise. Recent developments in momentum trading, and suggestions for improvements, are also discussed.
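The breakout trend-following idea mentioned in the blurb can be sketched in a few lines. This is a toy illustration on synthetic prices, not the book's system: the 20-day lookback, the entry rules, and the random-walk price model are all assumptions made here for demonstration.

```python
import random

random.seed(0)

# Synthetic daily closes as a stand-in for real futures data.
prices = [100.0]
for _ in range(500):
    prices.append(prices[-1] * (1 + random.gauss(0.0002, 0.01)))

LOOKBACK = 20  # breakout window (an illustrative choice)

position = 0   # +1 long, -1 short, 0 flat
pnl = 0.0
for t in range(LOOKBACK, len(prices) - 1):
    window = prices[t - LOOKBACK:t]
    if prices[t] > max(window):
        position = 1       # upside breakout: go long
    elif prices[t] < min(window):
        position = -1      # downside breakout: go short
    # Mark-to-market P&L for holding the position one day.
    pnl += position * (prices[t + 1] - prices[t])

print(round(pnl, 2))
```

A real evaluation would layer on the testing and life-cycle analysis the book describes; this sketch only shows the mechanical shape of a breakout rule.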

The Software Arts

Author: Warren Sack

Publisher: The MIT Press

ISBN: 0262039702

Category: Computers

Page: 400


In The Software Arts, Warren Sack offers an alternative history of computing that places the liberal arts at the very center of software's evolution. Tracing the origins of software to eighteenth-century French encyclopedists' step-by-step descriptions of how things were made in the workshops of artists and artisans, Sack shows that programming languages are the offspring of an effort to describe the mechanical arts in the language of the liberal arts. Sack offers a reading of the texts of computing (code, algorithms, and technical papers) that emphasizes continuity between prose and programs. He translates concepts and categories from the liberal and mechanical arts (including logic, rhetoric, grammar, learning, algorithm, language, and simulation) into terms of computer science and then considers their further translation into popular culture, where they circulate as forms of digital life. He considers, among other topics, the "arithmetization" of knowledge that presaged digitization; today's multitude of logics; the history of demonstration, from deduction to newer forms of persuasion; and the post-Chomsky absence of meaning in grammar. With The Software Arts, Sack invites artists and humanists to see how their ideas are at the root of software and invites computer scientists to envision themselves as artists and humanists.

Statistics and Probability in High School

Author: Carmen Batanero, Manfred Borovcnik

Publisher: Springer

ISBN: 9463006249

Category: Education

Page: 224


Statistics and probability are fascinating fields, tightly interwoven with the context of the problems which have to be modelled. The authors demonstrate how investigations and experiments provide promising teaching strategies to help high-school students acquire statistical and probabilistic literacy. In the first chapter the authors put into practice the following educational principles, reflecting their views of how these subjects should be taught: focusing on the most relevant ideas and postponing extensions to later stages; illustrating the complementary, dual nature of statistical and probabilistic reasoning; utilising the potential of technology and showing its limits; and reflecting on the different levels of formalisation needed to meet the wide variety of students' previous knowledge, abilities, and learning types. The remaining chapters deal with exploratory data analysis, modelling information by probabilities, exploring and modelling association, and with sampling and inference. Throughout the book, a modelling view of the concepts guides the presentation. In each chapter, the development of a cluster of fundamental ideas is centred around a statistical study or a real-world problem that leads to statistical questions requiring data in order to be answered. The concepts developed are designed to lead to meaningful solutions rather than remain abstract entities. For each cluster of ideas, the authors review the relevant research on misconceptions and synthesise the results in order to support the teaching of statistics and probability in high school. What makes this book unique is its rich source of worked-through tasks and its focus on the interrelations between teaching and empirical research on understanding statistics and probability.

Interpreting Probability

Controversies and Developments in the Early Twentieth Century

Author: David Howie

Publisher: Cambridge University Press

ISBN: 9781139434379

Category: Science

Page: N.A


The term probability can be used in two main senses. In the frequency interpretation it is a limiting ratio in a sequence of repeatable events. In the Bayesian view, probability is a mental construct representing uncertainty. This 2002 book is about these two types of probability and investigates how, despite being adopted by scientists and statisticians in the eighteenth and nineteenth centuries, Bayesianism was discredited as a theory of scientific inference during the 1920s and 1930s. Through the examination of a dispute between two British scientists, the author argues that a choice between the two interpretations is not forced by pure logic or the mathematics of the situation, but depends on the experiences and aims of the individuals involved. The book should be of interest to students and scientists interested in statistics and probability theories and to general readers with an interest in the history, sociology and philosophy of science.
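The frequency interpretation described in this blurb is easy to illustrate with a quick simulation (a sketch of the general idea, not material from the book): the relative frequency of heads over repeated fair-coin tosses settles toward 1/2 as the number of tosses grows.

```python
import random

random.seed(42)

def relative_frequency(n_tosses):
    """Proportion of heads in n_tosses simulated fair-coin flips."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The ratio wanders for small n and stabilizes near 0.5 for large n.
for n in (10, 1000, 100000):
    print(n, relative_frequency(n))
```

A Bayesian, by contrast, would treat the 1/2 not as a limiting ratio but as a degree of belief about the next toss, which is exactly the divide the book examines.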

Reader's Guide to the History of Science

Author: Arne Hessenbruch

Publisher: Routledge

ISBN: 1134262949

Category: History

Page: 965


The Reader's Guide to the History of Science looks at the literature of science in some 550 entries on individuals (Einstein), institutions and disciplines (Mathematics), general themes (Romantic Science) and central concepts (Paradigm and Fact). The history of science is construed widely to include the history of medicine and technology as is reflected in the range of disciplines from which the international team of 200 contributors are drawn.

Public Relations Review

A Journal of Research and Comment

Author: Foundation for Public Relations Research and Education (U.S.)

Publisher: N.A


Category: Marketing

Page: N.A


The Evidential Foundations of Probabilistic Reasoning

Author: David A. Schum

Publisher: Wiley-Interscience


Category: Mathematics

Page: 545


From Holmes's analysis of footprints and tobacco ash to modern institutional DNA testing, evidence has formed the cornerstone of probabilistic reasoning, both in fiction and real life. Too often viewed as irrefutable, evidence, argues David Schum, is an interpretive science, refracted through the varying perspectives of subject specialty. Evaluating how evidence is discovered, arranged, and used is essential not only for drawing conclusions, but also for developing an analytical scheme that transcends the particular skew of individual disciplines. In the first textbook treatment of evidence as a science, Evidential Foundations of Probabilistic Reasoning examines inferences drawn from evidence that is incomplete, inconclusive, and often imprecise. Layer by layer, the book disassembles the process of gathering, organizing, and evaluating evidence, activities that ultimately affect what conclusions are drawn from evidence and how new evidence is discovered. The book also presents a balanced account of the probabilistic process of assessing the force, strength, or weight of evidence, an examination that considers the many current views on evaluating evidence. A subject of growing interest and study, the imaginative reasoning process behind the discovery or generation of new evidence and new hypotheses, is also described. Featuring over one hundred numerical examples to illustrate the workings of various probabilistic expressions, as well as lively graphics which illuminate many of the evidential and inferential issues discussed, this is an essential working reference to every facet of the science of evidence.

Information-gap Decision Theory

Decisions Under Severe Uncertainty

Author: Yakov Ben-Haim

Publisher: N.A

ISBN: 9780120882519

Category: Technology & Engineering

Page: 330


Information-Gap Decision Theory presents a distinctive new theory of decision-making under severe uncertainty. Applications in engineering design and analysis, project management, economics, strategic planning, social decision making, environmental management, medical decisions, search and evasion problems, risk assessment, and other areas are discussed. Info-gap theory deals with many of the problems and questions of classical decision analysis such as risk assessment, gambling, value of information, trade-off analysis, and preference reversal, but the distinctive character of info-gap uncertainty repeatedly gives rise to new insights and unique decision algorithms. Furthermore, this book deals with many of the difficult interface issues facing the responsible decision maker such as value judgments concerning risk and immunity to failure, as well as philosophical implications of decision under uncertainty. This book is a fresh approach to the age-old problem of deciding responsibly with deficient information. An info-gap is the disparity between what is known and what needs to be known in order to make a well-founded decision. The book begins with a discussion of info-gap models of uncertainty, which provides an innovative approach to the quantification of severe lack of information. This book can be used in advanced undergraduate and graduate courses on decision theory and risk analysis. It is also of interest to practicing decision analysts and to researchers in decision theory and in human decision-making.
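The info-gap notion of robustness, the disparity between what is known and what needs to be known, can be sketched for a toy model (an illustrative assumption, not an example from the book): with reward R(q, u) = q*u, nominal estimate u_nom, and the interval uncertainty model U(h) = {u : |u - u_nom| <= h}, the robustness of decision q is the largest horizon h at which the worst-case reward still meets a critical level r_crit.

```python
def robustness(q, u_nom, r_crit):
    """Info-gap robustness for reward R(q, u) = q * u under the
    interval uncertainty model U(h) = {u : |u - u_nom| <= h}.
    Worst case over U(h) is q * (u_nom - h), so requiring
    q * (u_nom - h) >= r_crit gives h_hat = u_nom - r_crit / q."""
    h_hat = u_nom - r_crit / q
    return max(h_hat, 0.0)  # negative would mean even the nominal estimate fails

# The characteristic trade-off: demanding more reward leaves
# less immunity to uncertainty.
for r_crit in (0.2, 0.5, 0.8):
    print(r_crit, robustness(q=1.0, u_nom=1.0, r_crit=r_crit))
```

The monotone trade-off printed here, robustness falling as the required reward rises, is the qualitative point the theory turns into a general decision tool.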

The Taming of Chance

Author: Ian Hacking

Publisher: Cambridge University Press

ISBN: 1107650712

Category: Political Science

Page: 284


In this important study Ian Hacking continues the enquiry into the origins and development of certain characteristic modes of contemporary thought undertaken in such previous works as the best-selling The Emergence of Probability. Professor Hacking shows how by the late-nineteenth century it became possible to think of statistical patterns as explanatory in themselves, and to regard the world as not necessarily deterministic in character. In the same period the idea of human nature was displaced by a model of normal people with laws of dispersion. These two parallel transformations fed into each other, so that chance made the world seem less capricious: it was legitimated because it brought order out of chaos. Combining detailed scientific historical research with characteristic philosophic breadth and verve, The Taming of Chance brings out the relations between philosophy, the physical sciences, mathematics and the development of social institutions, and provides a unique and authoritative analysis of the 'probabilisation' of the western world.

Encyclopedia of Statistical Sciences

Author: Samuel Kotz

Publisher: Wiley-Interscience

ISBN: 9780471743767

Category: Mathematics

Page: 680


Countless professionals and students who use statistics in their work rely on the multi-volume Encyclopedia of Statistical Sciences as a superior and unique source of information on statistical theory, methods, and applications. This new edition (available in both print and on-line versions) is designed to bring the encyclopedia in line with the latest topics and advances made in statistical science over the past decade--in areas such as computer-intensive statistical methodology, genetics, medicine, the environment, and other applications. Written by over 600 world-renowned experts (including the editors), the entries are self-contained and easily understood by readers with a limited statistical background. With the publication of this second edition in 16 printed volumes, the Encyclopedia of Statistical Sciences retains its position as a cutting-edge reference of choice for those working in statistics, biostatistics, quality control, economics, sociology, engineering, probability theory, computer science, biomedicine, psychology, and many other areas. The Encyclopedia of Statistical Sciences is also available as a 16 volume A to Z set. Volume 6: In-L.

Grundbegriffe der Wahrscheinlichkeitsrechnung

Author: A. Kolmogoroff

Publisher: Springer-Verlag

ISBN: 3642498884

Category: Mathematics

Page: 62


This title is part of the Springer Book Archives digitization project, covering publications that have appeared since the publisher's founding in 1842. With this archive, the publisher provides sources for historical research and for the history of individual disciplines, each of which must be viewed in its historical context. This title appeared before 1945 and, on account of its period-typical political and ideological orientation, is therefore not promoted by the publisher.