A number of widely used contemporary processors have
instruction-set extensions for improved performance in multi-media
applications. The aim is to allow operations to proceed on multiple
pixels each clock cycle. Such instruction-sets have been
incorporated both in specialist DSP chips such as the Texas C62xx
(Texas Instruments, 1998) and in general-purpose CPU chips like the
Intel IA32 (Intel, 2000) or the AMD K6 (Advanced Micro Devices,
1999). These instruction-set extensions are typically based on the
Single Instruction-stream Multiple Data-stream (SIMD) model, in
which a single instruction causes the same mathematical operation
to be carried out on several operands, or pairs of operands, at the
same time. The level of parallelism supported ranges from two
floating-point operations at a time on the AMD K6 architecture to
16 byte operations at a time on the Intel P4 architecture. Whereas
processor architectures are moving towards greater levels of
parallelism, the most widely used programming languages such as C,
Java and Delphi are structured around a model of computation in
which operations take place on a single value at a time. This was
appropriate when processors worked this way, but has become an
impediment to programmers seeking to make use of the performance
offered by multi-media instruction-sets. The introduction of SIMD
instruction sets (Peleg et al.
A sweeping history of the full range of human labor. Few authors are
able to write cogently in both the scientific and the economic
spheres. Even fewer possess the intellectual scope needed to
address science and economics at a macro as well as a micro level.
But Paul Cockshott, using the dual lenses of Marxist economics and
technological advance, has managed to pull off a stunningly acute
critical perspective of human history, from pre-agricultural
societies to the present. In How the World Works, Cockshott
connects scientific, economic, and societal strands to produce a
sweeping and detailed work of historical analysis. This book will
astound readers of all backgrounds and ages; it will also
engage scholars of history, science, and economics for years to
come.
Classical Econophysics (Hardcover)
Allin F. Cottrell, Paul Cockshott, Gregory John Michaelson, Ian P. Wright, Victor Yakovenko
R4,437
Discovery Miles 44 370
Ships in 12 - 17 working days
This monograph examines the domain of classical political
economy using the methodologies developed in recent years both by
the new discipline of econophysics and by computing science. This
approach is used to re-examine the classical subdivisions of
political economy: production, exchange, distribution and
finance.
The book begins by examining the most basic feature of economic
life - production - and asks what it is about physical laws that
allows production to take place. How is it that human labour is
able to modify the world? It looks at the role that information has
played in the process of mass production and the extent to which
human labour still remains a key resource. The Ricardian labour
theory of value is re-examined in the light of econophysics,
presenting agent-based models in which the Ricardian theory of
value appears as an emergent property. The authors present models
giving rise to the class distribution of income, and the long-term
evolution of profit rates in market economies. Money is analysed
using tools drawn both from computer science and the recent
Chartalist school of financial theory.
Combining techniques drawn from three areas - classical political
economy, theoretical computer science and econophysics - to produce
models that deepen our understanding of economic reality, this new
title will be of interest to doctoral and research students, as
well as scientists working in the field of econophysics.
Classical Econophysics (Paperback)
Allin F. Cottrell, Paul Cockshott, Gregory John Michaelson, Ian P. Wright, Victor Yakovenko
R1,666
Discovery Miles 16 660
Ships in 12 - 17 working days
Computation and its Limits is an innovative cross-disciplinary
investigation of the relationship between computing and physical
reality. It begins by exploring the mystery of why mathematics is
so effective in science and seeks to explain this in terms of the
modelling of one part of physical reality by another. Going from
the origins of counting to the most blue-skies proposals for novel
methods of computation, the authors investigate the extent to which
the laws of nature and of logic constrain what we can compute. In
the process they examine formal computability, the thermodynamics
of computation, and the promise of quantum computing.
A manual for the Glasgow Pascal compiler that supports parallel
processing.