
Showing posts with label Philosophy of Information. Show all posts

Monday, 18 May 2015

Informationist Tweets

Check out Informationist (@Informationiste): https://twitter.com/Informationiste?s=09

Wednesday, 6 May 2015

Physicalist Information Realist Moment: Burns Bay, Lane Cove West

Observations from physicalist information theory and physical realist philosophy of information:
  1. A subjectivist about information (someone who thinks that information requires - or existentially depends upon - a subjective observer or signal-receiving and processing agent) has to explain why the photons reflecting from the environment off the water's surface are necessarily any less a store of structural information than their mental representation (reducing, perhaps, to the processing of signals in the retinotopic map) and the interaction between the two. They must explain why the green in the reflection is not providing information originating in the DNA of the tree, and why this in turn is not objectively existing structured information that would still be there without any observers.
  2. An eliminativist about information (information is not some thing in the world - just a label we use for when objective frequentist probabilistic uncertainty is reduced for an observer) has to answer the same challenge.
This argument is not an appeal to intuition, but a reductio ad absurdum. If subjectivism and eliminativism are correct, then there is nothing that we would normally call information in the DNA of trees. That seems simply wrong.









Existential Moments #2: Fletcher Lookout

As an information theorist and philosopher of information I can make the following observations:

The camera captures only a tiny part of the physical and structural information in the environment - much less than what the ocular organs capture. The camera of course stores its information far more permanently and accurately (although perhaps, as an artificial electrically operated digital device, its lifespan is limited by bit rot and technological obsolescence to less than the lifespan of my memories).




Existential Moments #1: Wentworth Falls and the Slack Stairs

When looking out over the Blue Mountains, and especially when hugging the rock on the Slack Stairs climb, Bishop Berkeley starts to look pretty stupid. Really.


From Wentworth Falls Lookout




From Wentworth Falls Lookout: E-NE

From Fletcher Lookout

From Wentworth Falls Lookout

Rocket Point Lookout Towards the Falls

Mid Slack Stairs

Mid Slack Stairs

Under Mid Slack Stairs

Web. No resident visible.

Slack Stairs: Fern

Up at Lower Wentworth Falls from below Slack Stairs and to the East.

Lower Wentworth Falls and Lagoon from below Slack Stairs and to the East.

Lower Wentworth Falls from below Slack Stairs and to the East.

Slack Stairs Cutting

Slack Stairs Cutting Looking Up


Panoramic from Wentworth Falls Lookout.



Wednesday, 25 December 2013

Do Complex Replicating Molecular Machines Evolve Using Stored Quantum Information?

The first article in the informationist biosciences section of Informationist eMagazine is an analysis and review of a lengthy paper presented by molecular and computational biologist A. V. Melkikh. Melkikh pursues the argument for partially directed evolution of replicators in protein synthesis using evolutionary game theory. He proposes that a great deal of information is required to guide the process of selecting workable conformations of biomolecules in order to arrive at new replicators in new environmental niches, and that the most likely source of this a priori information is the quantum structure of important biomolecules.

Monday, 16 December 2013

A Discussion with Alexander Gillett About The Metaphysics of Information, Physicalism About Information, and Non Eliminative Ontic Structural Realism

Ontic Cafe 6 beans Strength: Academic Specialists and Masochistic Layphilosophers only.


My PhD thesis presents a case for physicalism about information: that information is and reduces to the physical, and that there is no information without physical structures. An initial response to this position is often that it is simply a restatement of token physicalism. A lengthy debunking of this response is not within the scope of this post. However, an easy way to demonstrate that it is not sound is to point out that numerous philosophers of information and information theorists have proposed that non-spatiotemporal, non-causal structures are a sufficient condition for the existence of information. This means that there are non-physicalist conceptions of the nature of information itself. There are other such conceptions.
Another philosophical expression of physicalism about information, then, is that it is an anti-Platonist conception of information. This is at best a partial statement, but it serves to highlight the relevance of a physicalist position. Mine is also a non-subjectivist, non-pluralist (nominally) and non-statisticalist conception (the latter being controversial). I will not attempt to explain these positions extensively here. According to subjectivism about information, information is only realised in the context of being received and (usually) perceived by some kind of receiving agent (the agent need not be a cognitive agent - but can be simply an organismic consumer, as is the case with the teleosemantic theories of Ruth Millikan and Nicholas Shea.)
The dissertation (Physicalism About Information with Applications) draws on resources from applied mathematical theories of information - algorithmic, quantitative and computational. I use premises and arguments put by quantum computing and information theorists like Rolf Landauer, David DiVincenzo and Daniel Loss. For the metaphysical conception, the argument relies to a significant extent upon the kind of non-eliminative ontic structural realism put forward by James Ladyman and Steven French. However, I do not commit myself to any particular OSR perspective. The leading philosopher of information in the world today - Luciano Floridi - has demonstrated that part of the project of philosophers of information should be to investigate the nature of information. This is, ironically and perhaps unexpectedly at this time in the history of philosophy, a new metaphysical project. Those materialists familiar with the work of Rudolf Carnap and Willard Van Orman Quine will realise the relevance of this.

Recently I gave a presentation at the University of Melbourne Australian Postgraduate Philosophy conference on the subject of explanation in mathematics. The argument I presented, stated briefly, was that if mathematical abstracta are explanatory because they are informational, then there is a strong case to be made that, against the preponderance of philosophical intuition and thought on the matter, abstracta must be physical. This is not the argument that I am interested in making here, however.

To get to the point of this post, an attendee at the conference, Alexander Gillett, was interested in my use of ontic structural realism. Alex is a specialist in OSR, and had some interesting questions and perspectives to offer on the use of NOSR as a partial metaphysical basis for a physicalist conception and ontology of information. Alex kindly agreed to the publishing of relevant parts of our recent email communications on the matter. I offer them here for the interest of the Ontic Cafe audience.

First Exchange

Dear Bruce, Last month I attended the post graduate conference in Melbourne and I found your paper on OSR, information theory and a critique of Platonism (a.k.a. Mark Colyvan) fascinating. I have been researching structural realism for a while now, and having come from a different direction it was interesting to hear you discuss the information-theoretic versions of SR. I have only recently begun examining these approaches and trying to get to grips with them. I have read Floridi's "defence of ISR" - do you recommend any other papers by him or others?

After your talk I was asking you about an adequate definition of nature or the physical, and about Ladyman & Ross' work on real patterns, but unfortunately the time for questions ran out when you were about halfway through an answer on defining the physical. I was wondering whether you would be willing to discuss these issues now. Additionally, you made reference to a paper called "An Informational Physicalist Ontic Structural Realist Conception of Mathematical Abstracta" which sounds really interesting. Would you mind sending me a copy of this paper or any others that I might read?

You also talked about physical information sources or information-bearing structures as the basis of any possible explanatory account. In this regard, would I be right to say that information comes before meaning? Can I also ask, how do you consider information metaphysically? . . . Anyway, thanks for the talk, it was probably the most interesting thing I heard over the course of the conference and I'm sorry I didn't manage to catch up with you in person. Regards, Alex Gillett
Hi Alex,

I am pleased to hear from you Alexander. . . .

Yes I put information before meaning in a sense, but I take information to exist physically and as such to be intrinsically semantic on a particular causal basis.

I have a paper in review called “Information is Semantic but Has no Alethic Value”. [I will upload the pre-publication version of this paper to Ontic Cafe in the next instalment]

I have a couple of papers that I can send you. I have to submit one of them to a journal this week first (Physicalist Ontic Structural Realism About Information in Applied Mathematical Theories of Information). The other paper I will send you tomorrow.

Best Regards,

Bruce.
Hi Bruce, . . . I went to a conference the other week and it left me even more convinced that I need to engage with information theory more. Just a quick question: since I've come at information theory from a structural realist POV, what is it about Floridi's approach that you disagree with? Do you think your approach is more amenable to some metaphysical version of SR? Regards, Alex
So, brief bio out of the way, I'd like to say thanks for the paper, and if you are willing to send anything else I'd be delighted to read it.

Just a few comments vis-a-vis structural realism (SR). Both Ladyman & Ross (2007) and French & Ladyman (2003) - your paper seems to mix references here perhaps? - are more readily seen as promoting an eliminative metaphysical version of SR. They oppose the notion that structure is fundamentally causal, arguing that since fundamental physics is currently agnostic about causality, metaphysics should not posit it as a fundamental feature of reality (see Chapter 5 of Every Thing Must Go, 2007). Additionally, they propose an eliminative account of objects. Some have pointed out that this is inconsistent and they have subsequently relented and allowed in thin-objects - which are purely relational. Michael Esfeld & Vincent Lam have argued for this position more consistently (along with Fred Muller & Simon Saunders) - what they call Moderate Ontic Structural Realism (MSR) - on the basis of empirical evidence from fundamental physics and philosophical arguments surrounding the methodological issues relating to how to cash out objects in an ontology. Additionally, they also argue that reality is fundamentally causal or dynamic. As such, perhaps their approach is more conducive to your own?

In addition to these issues, Ladyman & Ross (2007) propose an ontological reading of Daniel Dennett's 'Real Patterns' as the basis of their ontology, and they couch this in information-theoretic language (see chapter 4). Their reason for doing so is to avoid explicit reference to causal process and to replace this with 'information-bearing'. What interested me about your work, was given your background, you could probably appraise this approach far better than I. I was wondering whether you had had a chance to engage with this material? The reason I ask is that you are the first person beyond Floridi to be explicitly discussing SR and information theory - and you disagree with the former. This is what interested me greatly.

Anyway, thank you for sending the paper and I hope to hear from you soon;
Hi Alex,

Thanks for your response.

I think it is the 2003 paper (it is possible that I have mixed references) where they were challenged by Cao to describe the kind of structure that they were talking about.

I have read the material that you are referring to. (Not all of it recently.)

My position is hyper-physicalist. People find it annoying and think I am making a mistake. But I have not heard a good reason to drop it yet. It is difficult to argue for because it requires informational structure to be real structure, and this requires the structural realist mathematician to be a physical structural realist. It also requires that there is a distinction between statistical measures of various kinds and what is being measured. In other words there is only one kind of real structure that is a necessary condition for the existence of information. It is non-eliminative physicalist causal structural realism. Yes – the dumb ass bumping together kind. This is highly unintuitive – but then so is the statistical conception of the measure of information (a MEASURE.)

I cannot do justice to it in one email, but I will send you some papers soon.

Interestingly, they start that paper (‘THE DISSOLUTION OF OBJECTS: BETWEEN PLATONISM AND PHENOMENALISM’) with:


One of the motivations for Ladyman’s ‘ontic’ form of SR is that it offers the realist some hope that she may be able to get away with carrying less metaphysical baggage than the ‘standard’ realist without having to fall into the clutches of the constructive empiricist (2003, 73.)


And in it they go on to say:

Cao persists in lumbering us with two seemingly contradictory identifications that we thought we had rejected in our paper. The first concerns the identification between physical structures and mathematical ones, which Cao then takes to imply that the ontic structural realist must be a Platonist.


I see why Cao does this. Because it looks like the only way that L & F have to go. As I will mention below, I think that Cao’s misinterpretation points to the right view for the metaphysics of information.

Then:

Now, we did say that the distinction between the mathematical and the physical may become blurred, particularly if the mark of the latter has to do with ‘substance’ or individual objects or the like. Nevertheless, blurring does not imply identity. The mathematical can be trivially distinguished from the physical in that there is more of it; there is more mathematics than we know what to (physically) do with, which is what Redhead expressed with his notion of ‘surplus structure’. What makes a structure ‘physical’? Well, crudely, that it can be related – via partial isomorphisms in our framework – to the (physical) ‘phenomena’. This is how ‘physical content’ enters. Less trivially, the mathematical can be distinguished from the physical in that the latter is also causal, (2003, 75)


Something very interesting, and controversial, happens with my physicalist conception of informational and real structures, and the concept of mathematical abstracta that goes with it. The distinction between the structural realism of the mathematician and that of the metaphysician in the philosophy of science IS eliminated. Only the latter physicalist conception is retained for structures that can be considered informational or information bearing.

There are lots of ways to approach this unpopular position. First however, here is what Ladyman and French then said to Cao:

Cao understands us as advocating ‘the dissolution of physical entities into mathematical structures’. But, first of all, by ‘dissolution’ we mean metaphysical reconceptualisation. And secondly, as we tried to emphasise, to describe something using mathematics does not imply that it itself is mathematical – the structures are what they are and we describe them in mathematico-physical terms. Let us put it as clearly as we can: we are not mathematical Platonists with regard to structures (2003, 75.)


So mathematical structure does reduce to physical structure for Ladyman and French. It must. I am saying that this also means that there is no information without physical structure, and that in fact all information must reduce to physical structure.

Shannon’s work creates big problems here, but only because it gets misinterpreted. Shannon does say that the model of a source (stochastic process) is also an information source, and he uses a statistical ‘measure’. But it is the measure that is statistical – not the information. The Boltzmann conception of entropy makes this worse not better. Even though entropy is a probabilistic measure in Boltzmann’s theory, the entropy or disorder is in fact physical. There is no real entropy without the physical configuration of the physical particle system. Saying that information is statistical is nonsensical according to Shannon’s account, especially if information is entropy. The measure of entropy is a measure of physical entropy, and even if you do not apply it to a real physical system, you don’t have real entropy without the physical system. Likewise with information.

What about relations between physical points, entities, and structures? What about mathematical patterns that are not physical structures? Aren’t they informational? Kolmogorov did not think they were even real. Neither do I. I therefore think that they are not informational (neither did Kolmogorov – his data objects were all strings of physical symbols of one kind or another. Tables of random numbers were tables of physical symbols for Kolmogorov.)

What about the fact that mathematical patterns seem to carry information in the sense that they can tell us things about a system even if the system is not realized? This seems to suggest (per Colyvan and Lyon) that there must be real Platonist informational structures. Ladyman and French do not want to be Platonist, but they misinterpret Shannon on the nature of information. I have suggested a way in which possibility spaces and probability spaces can in fact be physical structures (controversial but it is a solid argument) – but this is not what Ladyman and French do.

The best way to bring this out is with an error made by Dretske (and I think that it simply is an error.) Dretske demonstrates (1981, 26-31, 38) that there is a real distinction between causality and information. However, the conclusion that does not follow is that there can be information transmission and generation without causality. This is demonstrably false (I provide an argument in my thesis.) It also follows from this that if Ladyman and French pursue a statistical conception of information like Dennett's, they have not eliminated causality at all.

No causal physical structure – no real information.

Kind Regards, Bruce.
Hi Bruce,

Thanks for the emails. I've got to stay in all day waiting for a delivery, so the article on information theory and biology shall be both a great time killer and brain-food.

I really enjoyed the responses to my questions - very much appreciate the frankness. In particular, what interests me is what I shall refer to as the "Blurring problem". I have recently had a paper accepted to the Polish Journal of Philosophy which gives an overview of OSR and this issue of the dissolution of the distinction between the mathematical and the physical. In it I explore the various (inadequate) responses to this very problem. Now, although in the paper you cite, Ladyman agrees with French that causation can be used to "trivially" articulate this blurred boundary, mostly Ladyman has rejected the notion of causality as a distinguishing factor in solving this problem (see Every Thing Must Go; a recent interview on rationallyspeaking.org maintains the agnosticism).

Furthermore, I think their attempted response to Cao just begs the question via appealing to a tautology: the physical = the physical.

Additionally, Ladyman's relationship with platonism is also confusing - in some places he rejects it outright, and in others he advocates a naturalised version. And in the interview above he seems amenable to pythagoreanism or platonism (a la Max Tegmark's extreme OSR mathematical universe hypothesis), although not committed to either.

For all of these reasons it seems to me that your position is more amenable to Esfeld and Lam's perhaps. For them physical structure is causal and thus distinguished from the mathematical in this sense, and they reject platonism in this sense. (I have attached papers by Esfeld & Lam for your perusal if you are interested)

I think such a view is strengthened by your comments regarding the difference between a measure and the thing that is measured. This seems to be a similar issue to the blurring problem, where I think an account of scientific representation is missing. Ladyman & Ross's recent paper, "The World in Data" (2013), combines these two issues. In addition to stating that physical structure is modal, Ladyman & Ross conclude this paper by stating, following C. S. Peirce (and presumably extrapolating from their statistical RP approach to representation), that...


The fundamental empirical structure of the world is not mathematical but statistical. And there is no such thing as purely formal statistics. The 'principles' of statistics are simply whatever dynamics emerge from our collective exploration of, and discovery of patterns in, data. [...And this is premised on...] the grandest discovery of the twentieth century, that fundamental physics, and therefore reality itself, are irreducibly statistical. What is the world? It is the endless weave of patterns to be extracted from noise, at an endless proliferation of mutually constraining scales, that we will go on uncovering for as long as we have the collective and institutional courage that comes from love of objective knowledge, the great moral core of the Enlightenment.


On your account I take it this is rather like mistaking the finger pointing at the moon for the moon itself? However, it could also be interpreted in a more reasonable fashion when read in the light of their comments at the close of chapter 3 of ETMG (p. 189). In trying to unify fundamental physics with the special sciences they state that there are two options: either [a] reality is composed of infostuff; or [b] the world is not made of anything, but information is the main concept for understanding the objective modal structure of reality. They choose the second option, in which case they see some form of information-theoretic notation as the basal and collective account of scientific representation. This would again affirm their refusal that dissolution = identity, but point to an obscure fact that scientific representation is so accurate as to entail partial isomorphism - an astonishing fact colloquially referred to as the Wigner Puzzle by Colyvan. Textual support for this interpretation comes from their quoting of Zeilinger: "it is impossible to distinguish operationally in any way [between] reality and information" (ibid). So although they may be wrong about the statistical aspect - is there a consilience with your view here, or how does it differ? Do you maintain a more robust distinction between informational representation and physical structure as source? I'm guessing this is your view given your last comment, but how do you cash out this distinction in a philosophically robust manner?

I'm not in the best place to judge these views, and I am more interested in exploring the terrain of SR and its connectivity to information theory and an account of scientific representation than throwing my hat in with any side in this debate. As such, I'm throwing out these comments and questions as a devil's advocate and interested to see how you respond.

What I'm most intrigued by is your proposal that possibility and probability spaces are physical. Does this amend the indispensabilitist arguments for mathematical realism in favour of a form of nominalism? How does it relate to other issues with the applicability of mathematics? Could you perhaps spell out the argument in a little more detail?

Looking forward to your responses, Regards,

Alex

Thursday, 14 November 2013

Are There Informational Laws of Genome Evolution? – Part One: Information and Molecular Bioscience

Ontic Cafe Four Beans Difficulty (Detailed but written to be accessible to laypersons)


This post is the first of two. The material is from a quite complex field of the philosophy of biology and information theory in biology. It would normally be at least five Ontic Café Beans, but I have watered it down to about four. Non-academic philosophers should be able to get a reasonable idea of what is going on and benefit from an introduction to one of the hottest topics in the philosophy of information and biology.

Different Conceptions of Information

Let’s start with a rapid introduction to the philosophy of information. There are several conceptions of the nature of information – of what information actually is. These conceptions vary dramatically in their details and ontological commitments – the things that are taken to be necessary for some information to exist (in philosophical language we say “the necessary conditions” for information to exist.) Here a couple of quick examples will be instructive.

The most common understanding, and the most common scientific one, is that of quantitative information theories. In these theories one has information on a statistical or probabilistic basis. According to these conceptions information exists when there is a reduction in uncertainty about what is happening at an information source. An information source is any physical process that can be modeled statistically – about which you can say there is a certain probability of the next state of the source based on the current one. A simple example is you reading this sentence. Each word makes the next word more or less likely because of the structure of the English language and the rules (grammar and meaning) for making English sentences. The source is the text you are reading. This is the very example most used by the founder of modern quantitative information theory – mathematician Claude E. Shannon (The Mathematical Theory of Communication, 1948.)
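To make the statistical conception concrete, here is a minimal sketch (my illustration, not Shannon's own formalism) that treats a text as an information source and computes the entropy of its empirical symbol distribution. This is the zeroth-order version of Shannon's measure, which ignores the dependence of each symbol on its predecessors; the function name `shannon_entropy` is my own.

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Entropy in bits per symbol of the empirical symbol
    distribution of `text` (zeroth-order approximation)."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A structured English sentence is more predictable - and so carries
# fewer bits per symbol - than a source in which every symbol in the
# alphabet is equally likely.
print(round(shannon_entropy("the quick brown fox jumps over the lazy dog"), 2))
```

Shannon's full treatment models the dependence between successive symbols with Markov sources; conditioning on context lowers the entropy further, reflecting the redundancy of English grammar that the post's reading example describes.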

The main alternatives to quantitative statistical theories are algorithmic theories. These involve measuring the complexity of strings of symbols or what are called data objects. Any sequence of elements can be a data object. The longer and more complex the data object, the more information it has. The most famous is that developed by the Russian materialist mathematician Andrei Kolmogorov. In Kolmogorov’s theory the amount of information in a data object is given by the length of the shortest program or description required to generate or construct the data object.
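Kolmogorov complexity itself is uncomputable, but a standard way to illustrate the idea is to use an ordinary compressor as a rough upper bound on description length. The sketch below rests on that assumption, and the helper name `approx_complexity` is my own:

```python
import os
import zlib

def approx_complexity(data: bytes) -> int:
    """Rough upper bound on Kolmogorov complexity: the length of a
    compressed description of `data` (the fixed-size decompressor
    plays the role of the reference machine)."""
    return len(zlib.compress(data, level=9))

# A highly patterned data object has a short generating description
# ("repeat 'ab' 500 times")...
repetitive = b"ab" * 500
# ...while random bytes admit no description much shorter than themselves.
random_bytes = os.urandom(1000)

print(approx_complexity(repetitive) < approx_complexity(random_bytes))  # True
```

The comparison shows the core intuition: the repetitive string is long but carries little algorithmic information, while the incompressible random string carries close to the maximum for its length.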

Semantic Information

Quantitative statistical measure-based conceptions and definitions of information have often been seen as inadequate because, as Claude Shannon himself wrote in The Mathematical Theory of Communication, they do not attempt to capture any meaning of the symbols that are transmitted. His predecessor R. V. L. Hartley wrote that ``[i]t is desirable therefore to eliminate the psychological factors involved and to establish a measure of information in terms of purely physical quantities'' (Transmission of Information, 1928, 536.)

Shannon’s peer and mentor Warren Weaver first observed that in future it would be desirable to formulate a conception of information that accounted for meaning. Later theorists came to refer to such conceptions as theories of semantic information. There have been several of these – mostly naturalistic – offered by both mathematicians and philosophers. The first notable attempt was by the famous Vienna Circle mathematician and philosopher Rudolf Carnap. Carnap joined with mathematician Yehoshua Bar-Hillel to formulate a theory of semantic information in which the semantic information content of a sentence was determined according to a logical formulation (1953.) In lay terms the information content of a sentence is the set of all sentences that are false if that sentence is true.
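A toy version of the Bar-Hillel/Carnap idea can be run directly: fix a small set of atomic sentences, enumerate the state descriptions (truth assignments), and take the content of a sentence to be the state descriptions it rules out. The encoding below, including the function name `content`, is my own illustration rather than their formal apparatus:

```python
from itertools import product

def content(sentence, atoms=("p", "q")):
    """Bar-Hillel/Carnap-style content: the state descriptions
    (truth assignments over `atoms`) that `sentence` excludes.
    `sentence` maps a world (a dict of truth values) to True/False."""
    worlds = [dict(zip(atoms, vals))
              for vals in product([True, False], repeat=len(atoms))]
    return [w for w in worlds if not sentence(w)]

# "p and q" rules out three of the four state descriptions,
# so it is highly informative...
print(len(content(lambda w: w["p"] and w["q"])))     # 3
# ...while a tautology rules out none, and so carries no
# semantic information on this account.
print(len(content(lambda w: w["p"] or not w["p"])))  # 0
```

This also exhibits the account's much-discussed consequence, the Bar-Hillel/Carnap paradox: a contradiction excludes every state description and so comes out maximally informative.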

Later, various other conceptions of semantic information were offered. Philosopher Fred Dretske adapted elements of Shannon’s theory (1981 – Knowledge and the Flow of Information.) Mathematician Keith Devlin produced another logical conception (1991 – Logic and Information.) More recently, Luciano Floridi has produced a theory of semantic information that extends and adapts ideas put forward by Devlin, Bar-Hillel and Carnap. It is different in that it requires information to have alethic value – to be based upon data which are truthful according to certain fairly complex criteria (Floridi, Information in The Blackwell Guide to the Philosophy of Computing and Information - 2004, Information – A Very Short Introduction - 2011, The Philosophy of Information - 2012.)

The idea of semantic theories of information is that information and meaning are directly related somehow. Usually meaning is thought to involve truth value of some kind.

Meanwhile in Physics and Biology

An enormous part of the story of our understanding of the nature of information comes from physics. I will not say much about that here, except to say that physicists often regard information to be a physical thing. Another pioneer of information theory – the father of cybernetics, Norbert Wiener – once said that “information is information, not matter or energy...no materialism that does not admit this can survive...'' (1961, Cybernetics: or Control and Communication in the Animal and the Machine.) No physicist has claimed that information is matter or energy, but quantum computing pioneer Rolf Landauer was sure that it is physical (Information is a Physical Entity, 1996.)

An enormous amount of philosophical and technical thought about information comes from biology. This is not so surprising given the importance of the concept of information to genetics and DNA science. Traits inherited from one generation of phenotypes (organisms) to the next are described in terms of information. So is what is referred to as the central dogma of molecular biology: that information cannot go from the phenotype (the developed body) to the genotype (the gene/DNA.) In other words, if I cut my hand it will not mean that any child conceived by me in the future will have the same cut on their hand. More recently the central dogma has come under challenge from the field of epigenetics. In epigenetics, other things in addition to the gene – the DNA itself – are thought to contribute heritable information, or information that is passed from one generation to the next. This can include processes within the cytoplasm of the cell, or even things in the organism's environment like the structure of nests in which young are reared. Still - it is often information transmission that is of interest.

At least since Crick and Watson’s discovery of the double helix structure of DNA in 1953, biologists and philosophers of biology have been contemplating and arguing about the nature of information and information transfer in DNA and biosynthetic processes. Biosynthetic processes are processes in which smaller molecules are combined to form more complex molecules that have some more complex function (processes involving such things as the manufacture of proteins and other biological structures from genetic material.) Such processes are frequently described in terms of information.

Codes, encoding, transmission, and even information compression have been discussed as real in the processes of genetic material.

This all raises a question, however. We saw in the previous section that there are many conceptions of information. So which is the right one for biology? Molecular bioscientists and philosophers of biology are still trying to figure that out. There are even arguments about whether genetic information is semantic or not - whether it has meaning and, if so, in what way (see recent work by Nicholas Shea on what he calls Infotel semantics; the idea is that the meaning of genetic information is determined by its function.) Some philosophers of biology even hold what is known as an eliminative conception of information in biology: they eliminate it from the discussion, completely or partly, as a confusing metaphor that does not explain anything real (see Griffiths, Paul E., Genetic Information: A Metaphor in Search of a Theory http://philsci-archive.pitt.edu/89/1/Genetic_Information_etc.pdf.)

Are There Informational Laws in Genome Evolution and the Evolution of Protein Synthesis?

This entire area of the nature of information in molecular bioscience is complex and keenly debated. However, in this two-part series I am interested in a very specific part of the debate - one that is perhaps the most exciting and relevant to philosophy in general, and to much evolutionary science today. It involves the question of how protein synthesis evolved by natural selection. Protein synthesis is an incredibly complex biosynthetic process that has only recently come to be well understood. The complexity of protein folding and gene splicing meant that the details of these processes were wholly mysterious until recently. How such processes came to evolve naturally to their current state is an even more challenging mystery.

Above is an artist's representation of the process of protein synthesis from DNA, via DNA transcription and translation, into a chain of amino acids and finally into a folded protein. The process is staggeringly complex, with only the most basic fundamental steps represented here. Molecular bioscientists usually take it for granted that information is transmitted from the DNA to the protein. A much larger question, however, is how the information of the entire process, and the structures involved in it, came to be as it is by evolutionary processes. Eugene V. Koonin has proposed that "Although a complete physical theory of evolutionary biology is inconceivable, the universals of genome evolution might qualify as “laws of evolutionary genomics” in the same sense “law” is understood in modern physics." (http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1002173) The details of this theory involve the laws being expressed largely in statistical and informational terms.

Monday, 11 November 2013

Superluminal Information Transmission

By Bruce Long (B App Sc, BA Hons 1, MPhil, PhD candidate.)

Ontic Cafe 4 beans difficulty rating: 

The philosophy of physics is all abuzz lately - on top of its usual buzz. The popularization of the travails of string theory and the work of science popularisers like Lawrence Krauss, Peter Woit, John Gribbin, and Neil deGrasse Tyson have coincided with a mounting interest in metaphysics and the philosophy of physics among physicists and philosophers alike.

Spooky Action At A Distance
When Einstein examined the implications of quantum mechanics (most famously in his 1935 paper with Podolsky and Rosen), he made a disturbing discovery - what he called "spooky action at a distance". Einstein's spooky action at a distance is better known among physicists as non-locality or quantum entanglement.

What is entanglement, and what does it have to do with information transmission? Information transmission depends upon cause and effect. If cause and effect can operate instantaneously, then arguably information transmission can too.

Briefly, quantum mechanics predicts that two quantum systems (very small systems) such as photons, electrons, or other particles can affect each other's states instantaneously even when they are separated by large distances. The systems (particles) start out together in the same place in space and time, and then can become widely separated. After they are separated by a significant, non-trivial distance, a measurement of one that fixes its state will necessarily result in an instantaneous, opposite change in the state of the other (quantum particles have an intrinsic angular momentum called spin, and there is a direction to that spin - usually described as up or down.) If you measure one particle and find it spin up, the other particle will be found spin down, even if it has travelled a long way off. Instantaneously - with no delay at all. Spooky, said Einstein.

Now, instantaneously here literally means instantaneously. Not at the speed of light, or near it, but much, much faster. In fact, speed or velocity is not even the right thing to talk about. The cause-effect of entanglement, or non-local effects, is immediate. It would be like the pitcher throwing the baseball while the batter hits exactly the same ball at exactly the same time - and the pitcher is on Earth and the batter on the moon.

There is theoretically and practically no speed - just instantaneous change of the state of one physical system based upon the change in the state of the other (usually when the other system is physically measured.) It's almost like if a fan watches the baseball pitcher, then the action at a distance will automatically happen. If the pitcher is pitching, you know the batter is batting at that moment - with the same ball. If the fan goes and hands the pitcher a bat, there is extremely high likelihood (virtual certainty) that the distant batter will have become a pitcher at that exact same moment. It is THAT weird.
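The statistical signature of this correlation can be sketched in a few lines of Python. Note what the sketch actually is: a shared random value reproduces the perfect anti-correlation in a single, fixed measurement basis - which is precisely Einstein's local hidden variable picture, discussed below. (Bell's theorem, also discussed below, shows that no such trick can reproduce the quantum correlations once measurements in several different bases are allowed.)

```python
import random

def measure_pair():
    """Simulate one 'entangled' pair measured in a single shared basis.

    The shared random value plays the role of Einstein's hidden
    variable: it fixes both outcomes in advance, so the results are
    always opposite - like the pitcher and batter always acting on
    the same ball at the same moment.
    """
    hidden = random.choice(["up", "down"])      # the "hidden variable"
    alice = hidden
    bob = "down" if hidden == "up" else "up"    # always the opposite
    return alice, bob

# Every trial yields opposite outcomes, with no delay anywhere in the model.
for _ in range(5):
    alice, bob = measure_pair()
    print(alice, bob)   # always one "up" and one "down"
```

The weirdness of real entanglement is exactly that this classical explanation - outcomes fixed in advance by a shared hidden value - turns out to be ruled out by experiment.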

Einstein did not like spooky action at a distance at all, because it suggested that the usual understanding of cause and effect and causal chains in physics was largely wrong for quantum mechanics. Either something was wrong with the mathematics, said Einstein, or there was something very weird going on in the universe. He came to the conclusion that there must be intermediate causal structures between the entangled quantum systems that had not yet been detected physically. This view is called local hidden variable theory, or local realism. The realism means that there is really something there doing the entangling, and local means that spooky action at a distance simply does not happen: instead there is a hidden intermediate causal structure that is local to the quantum systems.

A Speed for Entanglement After All

Now many readers will be aware of Einstein's maxim that no body with nonzero rest mass (mass when not moving relative to any spatiotemporal frame of reference) can move faster than the speed of light. That's an immutable law of the universe - right? Well, maybe. Entanglement looks like action at a distance, Einstein's hidden local variables have not been found, and in fact the theory has turned out to be unsupported by empirical experimental findings.

Recently some Chinese physicists have tried to measure the speed of non-local effects (refer to the list of references at Physics News.) I have just said that there is no speed involved - so what is this experiment about? Well, their findings don't provide much comfort for the local realist. What they established experimentally - assuming no discovery of errors in the future - is that if there is a speed of entanglement, then it has to be at least 10,000 times the speed of light. They do not know whether it is the limitations of their equipment that set this value. The speed might be even higher - or there might be no speed at all, as suggested above.
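It is worth seeing how such a lower bound is derived: if two detectors separated by a distance d show correlated outcomes within a timing uncertainty dt, then any hidden signal passing between them would need a speed of at least d / dt. The numbers below are illustrative placeholders chosen to land near the quoted bound, not the actual values from the experiment.

```python
# Lower bound on the "speed of entanglement": any hidden signal between
# detectors separated by d, correlating outcomes within timing window dt,
# must travel at v >= d / dt.
# The d and dt values here are ASSUMED for illustration only.

C = 299_792_458.0   # speed of light, m/s

d = 15_000.0        # assumed detector separation: 15 km, in metres
dt = 5e-9           # assumed timing uncertainty: 5 nanoseconds

v_min = d / dt      # minimum speed of any hidden signal, m/s

print(f"Any hidden signal must travel at >= {v_min / C:,.0f} x the speed of light")
```

With these placeholder figures the bound comes out at roughly 10,000 times the speed of light; a tighter timing window or wider separation pushes the bound higher, which is why such experiments can only ever set a floor, never a ceiling.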

Bell's Theorem - Is Spooky Action at a Distance Real? Or are there hidden local intermediate causal structures?

The theoretical physicist John Bell made things even worse for Einstein in 1964, with a theorem showing that the statistical predictions of quantum mechanics cannot be reproduced by any local hidden variable theory of the kind Einstein hoped for.

Things got worse still for local hidden variable theory when Bell's result was supported by experiment in 1972 by John Clauser and Stuart Freedman. Alain Aspect did it again with experiments in 1981.

What Now - Superluminal Information Transfer?

In the best mathematical and scientific theories of information, information transmission involves loss due to signal noise and is limited by the transmission rates permitted by the transmission medium. Some information theorists assert that information transfer is only about the covariance - the correlated changing - of one structure with another, in such a way that the state of one system (an information receiver) tells one something about the state of the other (the information source) with a certain degree of probability.

Now, entanglement is not normally regarded by physicists as an information channel on a statistical basis, since there is no uncertainty about the state of one system if the state of the other is known. Statistical formal measures of information require statistical uncertainty, because according to those measures information just is a reduction in uncertainty, or an increase in probability, about the next state of the source based on the current state, or else based upon signals received that were caused by the current state. (See Robert M. Gray's text Entropy and Information Theory: http://ee.stanford.edu/~gray/it.pdf ; see also Warren Weaver's introduction to Claude E. Shannon's The Mathematical Theory of Communication.)
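This statistical point can be made concrete. The sketch below (illustrative only) computes Shannon entropies for perfectly anti-correlated spin outcomes: Alice's outcome on its own carries one full bit of uncertainty, but once Bob's outcome is known the conditional entropy drops to zero - which is exactly why entanglement, taken by itself, is an odd fit for the statistical conception of an information channel.

```python
from math import log2

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Perfectly anti-correlated spin measurements on an entangled pair:
# half the time Alice sees "up" and Bob "down"; half the time the reverse.
joint = {("up", "down"): 0.5, ("down", "up"): 0.5}
alice = {"up": 0.5, "down": 0.5}   # Alice's marginal distribution
bob   = {"up": 0.5, "down": 0.5}   # Bob's marginal distribution

h_alice = entropy(alice)                  # H(A): uncertainty in Alice's outcome
h_cond  = entropy(joint) - entropy(bob)   # H(A|B) = H(A,B) - H(B)
mutual  = h_alice - h_cond                # I(A;B) = H(A) - H(A|B)

print(h_alice)   # 1.0 bit: Alice's outcome alone is maximally uncertain
print(h_cond)    # 0.0 bits: once Bob's outcome is known, no uncertainty remains
print(mutual)    # 1.0 bit: the correlation itself is maximally informative
```

The correlation is maximally informative in the mutual-information sense, yet because neither party controls which outcome occurs, no message can be chosen and sent this way - which is the standard reason entanglement alone does not transmit information.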

I will put aside this consideration of the statistical conception of information as an impediment to quantum entanglement channels for information transmission. This is because the statistical conception is only one (albeit very important) element of the transmission of information, and only one conception of information transfer.

Whither the Second Law of Thermodynamics: Entropy defeated?

The point is that if entanglement is a real physical causal relation - if it really involves some kind of instantaneous physical cause-effect interaction - then information can be transferred instantaneously. If that is true, then many things become unclear. Because of entropy and the second law of thermodynamics, causal transmission is limited in a causally closed universe: energy loss and impedance limit transmission speeds in predictable ways.

However, if non-local quantum information transmission is real, then it looks very much like we might be able to send information without any signal loss at all. Even weirder, information might be transmissible with no intermediate causal pathway or structure. This is spooky indeed.

References:

J. S. Bell (1966), On the Problem of Hidden Variables in Quantum Mechanics, Rev. Mod. Phys. 38, 447.






