Tuesday, May 12, 2009

Evolution of the Computer Engineer

Let's look at living things through the eyes of a computer engineer. Three components are necessary:
  1. Hardware
  2. Software
  3. Intelligence and ability to learn
Then we have to consider that so far, software has not been able to create any intelligence or ability to learn analogous to a human's. Here the analogy immediately fails, for computers can never be people, no matter how many Asimov stories we read. Next, the software must be able to somehow influence the hardware, much like a robot programmed to repair itself and to construct and programme new robots just like it. So the software must be able to cause the hardware to produce a manufacturing facility, and to produce the software to programme that facility. This facility must in turn be able to produce the hardware and software which the first had, such that it too will be able to reproduce.

Now then, how did this hardware and software come about initially? Well, a computer engineer must have made it. It couldn't arise by itself. Neither the hardware nor the software could arise spontaneously. No amount of randomness and time can overcome a logical and mathematical impossibility - one will never be equal to two. If this is the case with something as simple as a computer or robot, then how much more impossible is it for the far more complex human?

What could provide the information - the software and hardware - to produce the manufacturing facility and operate it to produce hardware and software with the same ability? Computer engineers ought to know that goo-to-you-via-the-zoo evolution is absurd. Chemists also ought to know this, physicists ought to, and especially biologists. But the atheist can't figure it out. He tries, but his a priori determination to hate the God that he doesn't believe in keeps him from seeing sense.

Psalm 53

1 The fool hath said in his heart, There is no God. Corrupt are they, and have done abominable iniquity: there is none that doeth good.
2 God looked down from heaven upon the children of men, to see if there were any that did understand, that did seek God.
3 Every one of them is gone back: they are altogether become filthy; there is none that doeth good, no, not one.
4 Have the workers of iniquity no knowledge? who eat up my people as they eat bread: they have not called upon God.
5 There were they in great fear, where no fear was: for God hath scattered the bones of him that encampeth against thee: thou hast put them to shame, because God hath despised them.
6 Oh that the salvation of Israel were come out of Zion! When God bringeth back the captivity of his people, Jacob shall rejoice, and Israel shall be glad.
But even to these people, sinners no more depraved than I myself would be apart from the grace of God, Christ sends the call of the Gospel:
"O ye simple, understand wisdom: and, ye fools, be ye of an understanding heart."
- Proverbs 8:5
May the simple be wise, may Jacob rejoice, and may Israel be glad, for the glory of Christ!

20 comments:

Dr. Arend Hintze said...

But we are able to do exactly that: go from simple primordial (artificial/computer) life to highly complex (irreducibly complex) algorithms, forms, patterns, and robot controllers.
They are not human-equivalent, but that is a matter of time, nothing else.

Cheers Arend

Samuel Watterson said...

1) Even if this were true, the agent of such a feat would be us, not the machine. And we could also conclude that all the information came from us, and was not spontaneously generated. Information can be reprocessed, garbled, randomised, and transferred, but it cannot be increased.

2) It takes a great deal of naivety to say, "that is a matter of time, nothing else." There are logically and mathematically insurmountable obstacles in the way. Of course, part of the problem is adequately agreeing on a definition of what is equivalent to human.

May I recommend this paper? http://creation.com/information-science-and-biology

Dr. Arend Hintze said...

1) In an open system like the earth, we have energy coming from the outside "hitting" self replicators, which self replicate and thus increase information. That is the same way evolutionary models work: you feed in energy, and you either have a self replicator that increases its information by evolution, or you start a step earlier and make a self replicator out of random symbols (abiogenesis in the computer).
Information can be increased - in an open system - and this is what we have; this is what we model.

2)
a) If there were any general obstacle preventing information increase, evolution, adaptation, or the evolution of adaptive systems from happening, I would agree, but such arguments must hold true even for small systems, and we don't observe the obstacles you are talking about.
b) I agree, we don't have any working definition of human-equivalent (except the Turing test, which is tricky, to say the least), but that is not an argument against ever getting one.
c) Question: Do you say evolution can never make AI or human equivalents, or do you say that even if we had them running around, it would still be an argument for creationism?

Cheers Arend

Samuel Watterson said...

and of course, there is this: http://www.youtube.com/watch?v=V7__SWWSaGM

=D

Dr. Arend Hintze said...

I just read your link, which is not a paper; "paper" implies a scientific journal and peer review... which is what this is:

http://www.sciencemag.org/cgi/content/short/312/5770/61

and this:
http://www.ploscompbiol.org/article/info:doi/10.1371/journal.pcbi.0040023

Besides: are you a Christian? Because mocking someone with a YouTube clip is totally Jesus, you know!
No worries, no hard feelings on my end, I got the joke :)

Cheers Arend

Samuel Watterson said...

1) The "open system" argument does not work. "Just standing out in the sun won’t make you more complex—the human body lacks the mechanisms to harness raw solar energy. If you stood in the sun too long, you would get skin cancer, because the sun’s undirected energy will cause mutations.": http://creation.com/the-second-law-of-thermodynamics-answers-to-critics

2) a) To argue here, you would have to define what you understand to be equivalent to human (which you did not do in b). Second, it is a natural law that information cannot arise without a sender - hence the principle by which SETI operates. http://creation.com/evolution-it-doesn-t-add-up
b) But it does make your argument meaningless.
c) Evolution cannot increase information, and therefore it answers no questions about the origin of life as we observe it today.

Dr. Arend Hintze said...

Are you interested in understanding the world around you?

The argument with the sun you just made is not even a counterargument. "Eating an apple in a library gives you enough energy to read a book and learn (increase your information)" would be a counterargument on the same level... and that is not even how information works. Go and read the link you posted carefully; at least the first part explains information theory kinda well as an introduction. But the conclusions he/she makes are just crazy...

May I ask what educational background you have?

Cheers Arend

Samuel Watterson said...

I am interested in understanding the world around me.

The argument I made certainly is effective. The difference between standing in the sun and eating an apple is that we have mechanisms to usefully process the energy from an apple, but not from the sun. Now, where did these mechanisms come from?

The conclusions made in the articles I mentioned are not "crazy"; maybe you just don't like them, or maybe you have misunderstood the statements that are made. Some statements are made simply to show how the evidence consistently fits with the biblical worldview - while others are indeed direct logical conclusions from evidence, based upon various commonly accepted assumptions.

Tell me your educational background, and I'll tell you mine. I have a first-class honours bachelor's degree in Environmental Science, and received A1s in all the sciences (Maths, Applied Maths, Physics, Biology, and Chemistry) before I entered university. My final-year project was a thesis criticising global-warming theories, for which I also received an A1. I have since had a paper published on methane mitigation strategies for agriculture, and I am currently doing a master's by research in recycling techniques for advanced composite materials. I am also fairly widely read in general subjects, theological and scientific. But an educational background doesn't win an argument.

Dr. Arend Hintze said...

I have a PhD in genetics and developmental biology; I work on evolution, AI, and network theory.

The question about your background was for me to understand what level of understanding you have. If you say you have a background in math and physics, I can explain the logical fallacy you made at a much simpler mathematical level.

Do you know how information theory works?
You take an ensemble, say a bag of colored marbles: red, green, black, and white. You would compute the entropy as -sum(p * log p) over the colors. For a bag with a random distribution of balls, what information would you measure?
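
As a minimal sketch of that calculation (Python; the equal-count four-color bag is just an illustrative assumption):

    import math
    from collections import Counter

    def shannon_entropy(symbols):
        # H = -sum(p * log2(p)) over the observed symbol frequencies
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # A well-mixed bag with equal numbers of each color:
    bag = ["red", "green", "black", "white"] * 25
    print(shannon_entropy(bag))  # 2.0 bits - the maximum for four equally likely colors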

Cheers Arend

Samuel Watterson said...

Why are you attempting to explain a logical fallacy in the information theory argument to me? Surely Dr. Werner Gitt or Dr. Jonathan Sarfati would be more worthwhile opponents for such a discussion?

But in any case, a random sample of coloured marbles has very little to do with the kind of information in even the simplest living organism. I suggest you contact the author of this paper (http://creation.com/information-science-and-biology) if you think there is a problem with it.

Dr. Arend Hintze said...

Well,

citing someone else without being able to defend the citation and the arguments in it makes you an acolyte, but as far as I can tell you want to become a scientist (or already are one).

If you don't want that kind of discussion, no worries, I can let you be, but you were making claims that you should be able to defend by yourself.

I could just refer you to Shannon's paper about information theory and let him make the point; instead I tried to explain something to you, and that is: A self replicator driven by outside energy will increase the information (something very well defined). The bag of marbles is an analogy; if you understand that, you understand that your claim that information cannot increase is wrong.

But I can stop bothering you if you want?

Cheers Arend

Samuel Watterson said...

"A self replicator driven by outside energy will increase the information (something very well defined). The bag of marbles is an analogy, if you understand that you understand that your claim: information can not increase, is wrong."

No, it can only replicate - it cannot generate new information, which is the issue in question, not the mere replication of pre-existent information. You ought to know that - and I wonder why you tried to present such an obviously fallacious argument, for someone who claims to know a lot about the area? Someone who also claims that I'm a mindless acolyte?

The reason I expect you to bring your objections to the author of the paper is so that, if they are valid, the author can be corrected. I have no need to contact Shannon, however, since others have done this.

Dr. Arend Hintze said...

Shannon has been dead since 2001...

Anyway, information as defined by Shannon (the inventor of information theory) can increase, and the marble bag example proves that.

Or imagine Scrabble: a random sequence has no information, but if you put energy into it and order it, you can increase its information.

Besides: if you actually think you cannot increase information, your job as a scientist would be quite useless, wouldn't it?

Cheers Arend

Samuel Watterson said...

Sir, you are talking nonsense, and you seem not to understand the concept we are discussing at all. I already pointed out that the replication of information is not what we are speaking about. And replication itself is only possible if a mechanism is in place to harness outside energy for that purpose. If the mechanism is not in place, there is no way to harness the energy - leaving a Scrabble board in the sun will not produce anything useful. So how can the mechanism be put in place to begin with? Or how could it ever become more than what it was designed to be?

I wonder, did you read the paper I sent you? It explains that information is not simply statistical; it has five levels.

But give a Scrabble board to humans, who have many mechanisms themselves for harnessing energy from food (which in turn ultimately harness energy from the sun by various other mechanisms), and who also have a great deal of information, not only in their genetic code but also collected by means of their senses and cognitive abilities, and then random letters can be sorted into words.

What does this example illustrate? That information always requires a sender. When we design a computer program to play Scrabble, you know very well that it requires information to put the mechanism itself in place, and that once the mechanism is in place, energy can never let it create new information - only the information that it is programmed to generate. Good grief. I thought this was all very simple, but you seem not to want to understand, continually bringing up totally irrelevant arguments.

My job is this: using the information that gives me my mental capacity (both from nature and nurture), I process the information I receive via my senses, and transfer the information of my conclusions to others. Examining this system logically, it is very clear that I have not increased the information; a conclusion from two premises is not an increase of information, because the conclusion is ultimately contained within the premises. But I thought you were the maths expert?

Dr. Arend Hintze said...

Like I said, the first part of the linked page explains information theory quite well; the rest is just fantasies.
Information has been quantitatively defined by Shannon, and that is the term we use. It goes hand in hand with thermodynamics. The rest (levels 2 to 5 - please read them again) are statements about how the author thinks these things work, with no quantification.

Theorem 3 in that "paper" makes a statement about the "nature" of information, and misses the point that information is the power to predict; what he calls semantics is nothing other than higher-order correlations, which are well defined in information theory.

Everything he calls a theorem (after theorem 2) is an idea, nothing more.
Besides, theorem 2 is only true if the noise is not uniform.

Your argument that "the conclusion is ultimately contained within the premises" does not make a statement about information in an information-theoretic sense - or only if you were to look at the entire universe at once - but we are talking about open systems, self replicators, and information increase.

According to Shannon and thermodynamics: you need energy to make order, and order is information. Self replicators make order, because they make more of themselves, and they need energy to do so. The whole thing depends on an energy influx, which is only possible in an open system. An open system can be defined as a subsystem of a closed system: the entropy of the closed system increases, while information in a subsystem can increase locally.

Cheers Arend

Samuel Watterson said...

So you would say that there is essentially no difference between information and data? No qualitative difference between a well-shaken bag of marbles and a wall of hieroglyphs? You seem to have called this difference a "fantasy". And if this difference is not merely a fantasy, then your worldview and speculative evolutionary science must reconcile with it.

I've no problem with the 2nd Law of TD, but you're consistently missing the point that even in an open system as a subsystem within a closed system, the 2nd Law still applies even though order can increase locally - the tendency to disorder is still ubiquitously present. A mechanism for increasing order locally cannot arise by means of itself, or from unharnessed energy alone. Such a mechanism is itself highly ordered, and so your diversion about the special case of such an "ordering mechanism" within an open system is irrelevant to the question - how did such a mechanism arise to begin with?

This is a rhetorical question, since we know from the 2nd Law that such an ordered mechanism cannot arise without some other ordering mechanism to produce it. I'm speaking in non-technical terms here - but you get the point. There are only two options: either reject evolutionary biology and hold on to the 2nd Law, or reject the 2nd Law and hold on to evolutionary biology.

I'm sticking to the tried and tested 2nd Law. And that there is a difference between what I am currently writing, and a well-shaken marble bag.

Dr. Arend Hintze said...

There is a difference between a well-shaken marble bag and a wall of hieroglyphs: the bag has maximum entropy, and the hieroglyphs do not. But once you put energy into the bag, and one of the marbles is a self replicator, you move from a uniform distribution to a lot of white (self replicators) and few other balls. That would be less entropy in the bag.

Just do the following:
Take the bag (well-mixed, uniformly distributed balls) and count the whites and blacks; the probability of finding a black is 0.5, as is the probability of finding a white.
Entropy = -(0.5 * log2(0.5) + 0.5 * log2(0.5)) = 1 = max entropy.
Now put energy into the bag in the following form:
Take out two marbles, and if one is black and one is white, put back two whites; otherwise return both marbles. You will get more and more white and less and less black. Say we stop at some point and check how many blacks and whites you have, say p(white) = 0.8 and p(black) = 0.2, so entropy =
-(0.8 * log2(0.8) + 0.2 * log2(0.2)) = 0.72 - less entropy!
Max entropy is 1, and information is max entropy minus measured entropy, so the bag after the sorting (which took energy) contains 0.28 bits of information.
But you might ask: about what? And the answer is: about the sorting process.
Exactly as letters and their distribution in words contain information about language.
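
As a minimal sketch of that process (Python; the 100-marble bag and the 80% stopping point are just illustrative assumptions):

    import math
    import random

    def entropy(bag):
        # Shannon entropy in bits of the color distribution in the bag
        h = 0.0
        for color in set(bag):
            p = bag.count(color) / len(bag)
            h -= p * math.log2(p)
        return h

    # A well-mixed bag: half white ("self replicators"), half black.
    bag = ["white"] * 50 + ["black"] * 50
    print(entropy(bag))  # 1.0 bit - maximum entropy for two colors

    # "Put energy in": draw two marbles; if one is white and one is black,
    # return two whites; otherwise return both unchanged. Stop at 80% white.
    while bag.count("white") < 80:
        i, j = random.sample(range(len(bag)), 2)
        if {bag[i], bag[j]} == {"white", "black"}:
            bag[i] = bag[j] = "white"

    print(entropy(bag))  # about 0.72 bits, i.e. 0.28 bits below the maximum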

I understand the difficulty of the 2nd law vs. abiogenesis, but that is going from noise to a self replicator, which has nothing to do with energy + self replicator = information.

The explanation here is that perturbed systems with high entropy might self-organize, and self-organization (using energy) again makes information and sometimes self replicators. But self-organization is not very well understood.

Cheers Arend

Samuel Watterson said...

"energy + self replicator = information."

This is related to our different definitions of information. I understand information to be more than simply data. Define what "random" means, and then I will explain how information differs from mere data. I have no problem with "energy + self-replicator = data." In fact, I don't even have a problem with "energy + replicator = information," if you don't mean "new information."

Energy can be used to replicate, transfer, and manipulate information in various ways, and the 2nd Law means that the tendency is always for the information to decrease, but by no stretch of the imagination can new information ever be produced.

I think the reason that you don't want to concede the difference between data and information is that doing so concedes that information must always have an intelligent sender (the principle by which SETI operates), and since every living organism carries a considerable amount of information, purely materialistic, evolutionary biology cannot be maintained.

Dr. Arend Hintze said...

Okay, let us just for a moment ignore that information is already defined. And I indeed would like you to explain the difference between what you call data and information.

There are two possible definitions: a random event is unpredictable, or it is no more predictable than it would be without any information about that event. Thus randomness is the absence of predictability.

Or in terms of TD:
Randomness is the process that generates entropy. Wherever you find entropy, it was caused by a random process.

I have a hunch what your next argument will be, but I don't want to speculate. Just keep in mind that a random process also requires energy... without energy you have no motion, no rearrangement, no randomness. There is a reason a system in stasis is called a zero-entropy system; what people forget to say is that it also means zero information, since a system in stasis allows only one symbol at any given time.

Anyway, I am honestly curious about your definition of data as distinct from information, since complexity is only defined for information, and Kolmogorov's definition of "data" complexity is super impractical.

Cheers Arend