Unsatisfactory Scientific Explanations?

Justatruthseeker

Yes, of course. The point is just that a learning system can learn a language by example without any prior linguistic coding.

No it can't - those recognition programs contain linguistic coding that they compare those images against until a match is found - or a best estimate.

Improvement is a subjective term. If a mutation allows bacteria to better survive antibiotics, it's not an improvement from our point of view, but if the bacteria had a point of view, they would disagree; if a mutation makes a yeast help brew a better flavoured beer, it's debatable whether that's an improvement for either the yeast or us; brewers and beer drinkers might say so, and the yeast would be bred in huge numbers, but most would be killed afterwards - swings and roundabouts.

If a mutation in a plant makes its flower more visible or attractive to a pollinating insect, or a mutation in the insect makes it better at recognising the flower, or stronger in competition with mates, or able to hide better from predators, it can be viewed as an improvement for the mutated organism.

Nobody is saying that mutations are necessarily advantageous; current opinion is that of mutations in a population, the vast majority are neutral; of the remainder, most are maladaptive, and only a few are advantageous - but those few can and do make a difference where reproductive advantage is involved. There are also quite a few that work both ways - disadvantageous in some conditions and advantageous in others (e.g. thalassemia, heterozygous sickle-cell trait, etc.).

None of them are advantageous - a mutation to your eye will NEVER be passed on to your offspring, because it did not occur in the reproductive genome. And any mutation that occurs in the reproductive genome will be useless to the current host. So the idea of mutations adding up cumulatively over successive generations is straight-out fantasy. The problem is that biologists now consider any permanent change to the genome a mutation - even if that change is the result of ordinary dominant and recessive genes.

That's just it. You will harp on mutations as the cause of variation - while ignoring the observational evidence. Because you can't fit your theory to the observational evidence. Despite your claims of mutation, E. coli in the lab remained exactly what they started as - E. coli - and will forever remain E. coli - because they receive no genomes from another infraspecific taxon within the bacterial species to which they belong.

http://www.christianforums.com/thre...an-evolutionist.7916357/page-15#post-68891501

All that has occurred is that dominant and recessive genes have become dominant or recessive. Mutation adds NO new information but what already existed. It may be written in a different format - but nothing new was added that did not already exist, or would not have come about naturally over time.

In every single experiment with mutation in actual breeding animals and plants, your theory has been falsified.

http://www.weloennig.de/Loennig-Long-Version-of-Law-of-Recurrent-Variation.pdf

Mutation soon reaches a limit after which no new forms are EVER produced. But you ignore 70+ years of plant and animal husbandry too, the only place mutation has been studied with actual breeding animals and plants.

Mutations are DAMAGE - ERRORS in the code and nothing more. Yes - I am not arguing that mutation doesn't once in a blue moon accidentally help an organism - but again - as has been shown in plant and animal husbandry - that small benefit is usually outweighed by the damage it causes to the organism in other ways.

No, you confuse me with someone else - I truly do accept mutations - I just accept what they really are.

https://www.google.com/search?q=bir...ved=0ahUKEwjyvMWe9KfJAhVEo4gKHanMCFoQ_AUIBigB
 

Justatruthseeker

The comparison between neural nets and biological brains is useful to a degree, but the assumption that they should be equivalent if they work on generally the same principles is misguided.

Let's, instead, look at differences between various brains and computers.

Both brains and computers work with logic gates. The logic gates of brains are slow, though. They can only fire dozens to hundreds of times per second (let's call the upper bound 200 Hz). Computers, on the other hand, can operate at orders of magnitude greater speed. So how can a 2,000,000,000 Hz computer lose out to a 200 Hz brain? The brain is massively parallel. We have on the order of 100 trillion synapses in the human brain. No computer even approaches this.

Computers do lots of different jobs, though. Aren't some better suited to massively parallel processing? Yes. Video cards, for example, operate at much slower speeds, but can process vastly more data in parallel. We can use the faster CPU to process data, but there is a big performance hit there because it's got to do it more linearly. We can likewise simulate the even more massively parallel brain in a computer, but we take a performance hit just like we do using a CPU as a GPU.

But can we simulate a simpler brain or a part of a brain? Yes. We can. We can pretty much exactly replicate the functions of a number of simple organisms and, as discussed earlier, we are working on parts of more complex brains.

Agreed - but that computer will never write random code to its operating system and do anything but shut down.
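To put rough numbers on the throughput comparison quoted above, here is a back-of-envelope sketch. The 200 Hz, 2 GHz, and 100-trillion-synapse figures are the ones used in the post; the arithmetic is order-of-magnitude only, not a claim about real hardware or neurons.

```python
# Back-of-envelope comparison using the figures from the post above.
# Order-of-magnitude only: synaptic events and CPU cycles are not
# equivalent units of work.
SYNAPSES = 100e12         # ~100 trillion synapses in a human brain
NEURON_RATE_HZ = 200      # the post's upper bound on firing rate
CPU_CLOCK_HZ = 2e9        # a 2 GHz processor

brain_events = SYNAPSES * NEURON_RATE_HZ   # parallel events per second
cpu_cycles = CPU_CLOCK_HZ                  # idealised serial cycles per second

print(f"brain: {brain_events:.0e} synaptic events/s")   # 2e+16
print(f"cpu:   {cpu_cycles:.0e} cycles/s")              # 2e+09
print(f"ratio: {brain_events / cpu_cycles:.0e}x")       # 1e+07
```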
 

[serious]

Agreed - but that computer will never write random code to its operating system and do anything but shut down.
Nowhere in my post did I bring up computer code.

The failure here is in the poor analogy between DNA and computer programming languages.
 

[serious]

As a way to bridge the issues with the poor analogy between DNA and computer programming languages, the best bet would be to create a programming language that includes relevant aspects of DNA code.

A first approach at requirements for our new programming language:

1. All series of arbitrary characters must compile
2. Similar codes should produce similar functions (other than frame-shift issues)

What key features am I still missing?

For simplicity's sake, let's assume we are working with a microcontroller.
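A minimal sketch of what such a language could look like, purely illustrative: the four operations and the modulo trick are invented for this example. Every character string "compiles" (requirement 1), and a single-character change usually changes behaviour only locally (requirement 2).

```python
# Toy interpreter in which *every* string "compiles" (requirement 1):
# each character selects one of a few register operations, so there are
# no syntax errors, and a one-character change alters only one step
# (requirement 2, barring frame-shift-style insertions and deletions).
OPS = {
    0: lambda r: r + 1,     # increment
    1: lambda r: r - 1,     # decrement
    2: lambda r: r * 2,     # double
    3: lambda r: r,         # no-op
}

def run(program: str, register: int = 0) -> int:
    """Execute an arbitrary character string; nothing can fail to compile."""
    for ch in program:
        register = OPS[ord(ch) % len(OPS)](register)
    return register

print(run("hello"))    # 2
print(run("hellp"))    # 3 - one character changed, nearby result
```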
 

whois

If you are going to take the computer approach, then you need to envision DNA as software running on the machine of the nucleus.
I don't think you can make a one-to-one comparison with current technology.

If you use current technology, then DNA can be seen as a serial transmission of data with start and stop bits.
This data is then assembled into the functioning code which is used by the computer.
The thing with genes is that they can code for more than one trait, and the CPU must determine what these are.

The key features I see that are missing are the effects of mutated genes.
Some mutations will cause an aborted birth, while others will cause deformities (some severe) or maybe no effect at all.
As I see it, we have no way to determine these unknowns.
This is one area I don't think computers can be of much help; they simply cannot model the unknown.
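To make the "serial stream with start and stop markers" picture concrete, here is a minimal toy parser. The start codon ATG and the stop codons TAA/TAG/TGA are the standard ones; the input sequence and the function itself are invented for the example.

```python
# Minimal sketch: scan a base sequence for ATG...stop "instructions",
# yielding variable-length genes the way a serial stream is delimited
# by start and stop markers.
START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def find_genes(seq: str):
    """Yield gene candidates delimited by a start codon and a stop codon."""
    i = 0
    while (i := seq.find(START, i)) != -1:
        for j in range(i + 3, len(seq) - 2, 3):   # walk codon by codon
            if seq[j:j + 3] in STOPS:
                yield seq[i:j + 3]                # include the stop codon
                break
        i += 1

dna = "CCATGAAATTTTAGGGATGCCCGGGAAATGACC"          # invented example
for gene in find_genes(dna):
    print(gene, len(gene))   # two hits of different lengths (12 and 15)
```

Note that the two "instructions" it finds have different lengths - the variable-length point that comes up again later in the thread.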
 

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
Nowhere in my post did I bring up computer code.

The failure here is in the poor analogy between DNA and computer programming languages.

DNA is an operating code. A programming language written by God. Your denial of this will not change that fact.

It is so far beyond binary code that we are unable to fathom it. It is not a 2 bit language (0 and 1) but a 4 bit code (T, C, G, A).

Now imagine what you could do with that computer if you were capable of programming it with a 4 bit code instead of merely 2. Do you then believe it would still be as limited in its abilities as it is with that simple 2 bit code?

The failure here is in realizing that God wrote an operating system so far beyond our understanding that we are still dabbling in 2 bit code - thinking we are brilliant.
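For reference, in information-theoretic terms a four-letter alphabet carries log2(4) = 2 bits per symbol, so DNA is usually described as base-4 rather than "4-bit". A minimal sketch of packing bases into binary; the particular letter-to-bits mapping is an arbitrary choice for the example:

```python
import math

# A four-letter alphabet carries log2(4) = 2 bits of information per
# symbol, so each DNA base maps onto two binary digits.
BITS_PER_BASE = math.log2(4)          # -> 2.0
ENCODING = {"T": 0b00, "C": 0b01, "G": 0b10, "A": 0b11}   # arbitrary mapping

def pack(seq: str) -> int:
    """Pack a base sequence into an integer, two bits per base."""
    value = 0
    for base in seq:
        value = (value << 2) | ENCODING[base]
    return value

print(BITS_PER_BASE)             # 2.0
print(bin(pack("GATTACA")))      # 0b10110000110111 - 14 bits for 7 bases
```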
 

Justatruthseeker

As a way to bridge the issues with the poor analogy between DNA and computer programming languages, the best bet would be to create a programming language that includes relevant aspects of DNA code.

A first approach at requirements for our new programming language:

1. All series of arbitrary characters must compile
2. Similar codes should produce similar functions (other than frame-shift issues)

What key features am I still missing?

For simplicity's sake, let's assume we are working with a microcontroller.

What is missing is that DNA is not a simple 2 bit code. When you have a 4 bit operating system - come talk to me then - until then - we will sit and try to compare 2 bit code to 4 bit code and wonder why we are missing a billion other possible combinations.
 

[serious]

When you have a 4 bit operating system - come talk to me then -
You realize Windows 10 is a 64-bit operating system, right? Most from a decade ago were 32-bit. 4-bit computers date back to at least the 1970s.
 

whois

DNA is an operating code.
If we make the software assumption for DNA, then DNA is the software that you use on a computer;
the operating system is the nucleus of the cell.
It is so far beyond binary code that we are unable to fathom it. It is not a 2 bit language (0 and 1) but a 4 bit code (T, C, G, A).
If we assume a machine code level, then DNA can be divided into genes, which consist of units of 3 "bits" each, each "bit" taking 1 of 4 possibilities.
Current technology on the machine level is of fixed length; each instruction is a fixed number of bytes.
This wouldn't be true with DNA code, where each "instruction" is delineated by start and stop bits.
Another difference is that genes can code for more than one trait.
Now imagine what you could do with that computer if you were capable of programming it with a 4 bit code instead of merely 2. Do you then believe it would still be as limited in it's abilities as it is with that simple 2 bit code?
You can't compare computers with DNA code in this regard, because DNA code is a variable-byte-length code: each "instruction" is of a different length. This isn't true with computers, where each instruction is a fixed length.
The failure here is in realizing that God wrote an operating system so far beyond our understanding that we are still dabbling in 2 bit code - thinking we are brilliant.
What happens if we break the code?
What if DNA really is an encrypted cipher?
Will anything ever become of this stuff?
It will never see the light of day if it somehow disproves evolution, you can count on it.
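As a concrete footnote to the "units of 3, each taking 1 of 4 possibilities" description above, a short sketch enumerating the codon space; the 20-amino-acid figure is the standard genetic-code count.

```python
from itertools import product

# 3 positions x 4 possible bases each -> 4**3 = 64 possible codons.
BASES = "TCGA"
codons = ["".join(c) for c in product(BASES, repeat=3)]
print(len(codons))     # 64

# The standard genetic code maps these 64 codons onto 20 amino acids plus
# stop signals, so many near-identical codons yield the same product
# (the code is degenerate/redundant).
print(64 / 20)         # 3.2 codons per amino acid on average
```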
 

Justatruthseeker

You realize Windows 10 is a 64-bit operating system, right? Most from a decade ago were 32-bit. 4-bit computers date back to at least the 1970s.

I am not even going to dignify that with a reply - because you will simply ignore the science when I present it to you, as you ignore the science in every post I present to you - so instead I will let someone else here explain what is meant by a 64-bit operating system. Just understand it has to do with chip architecture - memory addresses and bit size - and nothing to do with the actual code running the system.
 

Justatruthseeker

If we make the software assumption for DNA, then DNA is the software that you use on a computer;
the operating system is the nucleus of the cell.

No, because the nucleus of the cell does not exist until that code is first read and assembled. The cell is like one of the peripherals, which runs according to the instructions given it - and nothing more. The cell does exactly what the code in the DNA tells it to do - following its instructions to the ummmm - letter. ;)

If we assume a machine code level, then DNA can be divided into genes, which consist of units of 3 "bits" each, each "bit" taking 1 of 4 possibilities.
Current technology on the machine level is of fixed length; each instruction is a fixed number of bytes.
This wouldn't be true with DNA code, where each "instruction" is delineated by start and stop bits.
Another difference is that genes can code for more than one trait.

Again no, one strand of DNA has every combination possible already set - it requires no external input to begin operating.

You can't compare computers with DNA code in this regard, because DNA code is a variable-byte-length code: each "instruction" is of a different length. This isn't true with computers, where each instruction is a fixed length.

Again no. You are simply refusing to accept that, just as computer code is segmented into parts that operate certain functions, so too is DNA. A computer with just the instruction set for operating the graphics card (eye) is useless without the entire code. But then DNA is not a simple 2 bit operating system - it is 4 bits - capable of adapting as needed.

What happens if we break the code?
You get birth defects - it no longer operates correctly - just as if you break computer code.

What if DNA really is an encrypted cipher?

Anything is possible in what we do not yet understand.

Will anything ever become of this stuff?
It will never see the light of day if it somehow disproves evolution, you can count on it.

Oh I believe that totally - the science would then be ignored as they ignore the science when it comes to evolution. It's already been disproved - they simply try to double-talk their way around the science.
 

[serious]

I am not even going to dignify that with a reply - because you will simply ignore the science when I present it to you, as you ignore the science in every post I present to you - so instead I will let someone else here explain what is meant by a 64-bit operating system. Just understand it has to do with chip architecture - memory addresses and bit size - and nothing to do with the actual code running the system.
You asked for a 4-bit OS. Windows is a 64-bit OS.
 

lesliedellow

I am not even going to dignify that with a reply - because you will simply ignore the science when I present it to you, as you ignore the science in every post I present to you - so instead I will let someone else here explain what is meant by a 64-bit operating system. Just understand it has to do with chip architecture - memory addresses and bit size - and nothing to do with the actual code running the system.

Really? So when I write "sub rax, 7B419D45AD03h", that has got nothing to do with rax being a 64-bit register?
 

lesliedellow

Actually, it has nothing to do with the number of bits in the register. A sixty-four bit register holds the same range of values as eight hexadecimal digits, or eight bytes. To deal with "7B419D45AD03h" you would have to load the register twice.
By the way, the IBM 1401 system that I used to run back in the sixties had an eight bit byte, but one bit was a parity bit, and one bit was a "word mark" which would indicate the beginning of an instruction or field of information. It was not even pure binary, because it ran what was called BCD (binary coded decimal), so that in binary the number seven would be coded "10001111b". The number one would be "10000011b", the number three as "10000111b" and the number thirty-one as two bytes, "10000111b, 00000010b". The last bit of each byte was a "check bit" or "parity bit" to catch a hardware error if one bit were copied wrong. I seem to remember it ran on odd parity, but it might have been even. Instructions could be of variable length. I remember that "read a card" was "1", punch a card was "2" and "write a line on the printer was "3". It was a very powerful machine and actually had 16k bytes of core memory. The idea of self-modifying code was toyed with but discarded because of the difficulties of documentation, or so we were told by our bosses at NSA.
Some thought that those "bosses" were actually a computer, but this was largely disbelieved by the cognoscenti, on the grounds that no computer could be that stupid.

:wave:

According to the AMD manual, we are both wrong. With an immediate operand you can do a sign-extended 32-bit subtraction from a 64-bit register, or you can subtract a 64-bit register from a 64-bit register, or you can subtract a 64-bit memory operand from a 64-bit register. A 64-bit register holds the same range of values as 16 hexadecimal digits:

0xF - 4 bits
0xFF - 8 bits
0xFFFF - 16 bits
0xFFFFFFFF - 32 bits
0xFFFFFFFFFFFFFFFF - 64 bits
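A quick sketch confirming the digit arithmetic: each hexadecimal digit carries log2(16) = 4 bits, so a register of a given width spans width/4 hex digits.

```python
import math

# Each hexadecimal digit carries log2(16) = 4 bits, so a register that is
# `bits` wide is spanned by exactly bits / 4 hex digits.
assert math.log2(16) == 4

for bits in (4, 8, 16, 32, 64):
    max_value = (1 << bits) - 1          # the all-ones pattern
    print(f"{bits:>2} bits -> 0x{max_value:X} ({bits // 4} hex digits)")
```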
 

FrumiousBandersnatch

No it can't - those recognition programs contain linguistic coding that they compare those images against until a match is found - or a best estimate.
I was referring to the Annabell network, which - as already described - is notable for having no linguistic coding.
None of them are advantageous - a mutation to your eye will NEVER be passed on to your offspring, because it did not occur in the reproductive genome. And any mutation that occurs in the reproductive genome will be useless to the current host.
When discussing mutations and evolution, it's assumed that they are heritable mutations, for that very reason. A beneficial mutation doesn't have to benefit the individual in whom it first appears. As long as that individual reproduces, it's the viable offspring that benefit and propagate any advantage through the population over many generations.
So the idea of mutations adding up cumulatively over successive generations is straight-out fantasy.
Clearly not if the mutation is heritable and advantageous.
The problem is that biologists now consider any permanent change to the genome a mutation - even if that change is the result of ordinary dominant and recessive genes.
It's not clear what you mean - it is the alleles of a gene that are dominant or recessive; the difference is caused by mutation at some point in the ancestral tree of one or other (or both) contributors.
That's just it. You will harp on mutations as the cause of variation - while ignoring the observational evidence.
In discussing genetics, variation usually refers to genetic variation. Mutations change DNA; when they change genes, that is, by definition, genetic variation. But if you want to throw the net more widely, DNA controls development and function. Differences in development and function between creatures are variation between creatures.
Because you can't fit your theory to the observational evidence.
The theory is based on observational evidence (also, it's not my theory, I just think it's right).
Despite your claims of mutation, E. coli in the lab remained exactly what they started as - E. coli - and will forever remain E. coli - because they receive no genomes from another infraspecific taxon within the bacterial species to which they belong.
Individual mutations are not speciation. It would take a large number of significant mutations before a population of E. coli was sufficiently different to justify a new species name; it seems unlikely to be observed in vivo, given the relative consistency of their environment (the gut). Not being a microbiologist, I don't know the criteria that would apply to that kind of decision.
All that has occurred is that dominant and recessive genes have become dominant or recessive. Mutation adds NO new information but what already existed. It may be written in a different format - but nothing new was added that did not already exist, or would not have come about naturally over time.
It's the alleles of a gene that are dominant or recessive. You really need to get a grasp of basic genetics, how genes, alleles, inheritance and mutation are related. I can't teach you that here, but there are good, clear sources online, such as the Learn Genetics site - or Wikipedia.
In every single experiment with mutation in actual breeding animals and plants, your theory has been falsified.
Nope - as I said, single mutations don't make a new species. Cross-breeding varieties doesn't make new species. Also, speciation has been observed both in the lab and in the wild.
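To illustrate how a heritable, advantageous mutation can propagate through a population, here is a minimal Wright-Fisher-style sketch. The population size, 5% advantage, and generation count are arbitrary choices for the example, and it starts from 1% mutants because a single brand-new mutant copy is often lost to random drift before selection can act.

```python
import random

# Minimal sketch: a heritable mutation with a small reproductive advantage
# spreading through a fixed-size population (Wright-Fisher-style resampling).
POP_SIZE = 1000
ADVANTAGE = 1.05        # mutants leave ~5% more offspring on average
GENERATIONS = 200

population = [i < POP_SIZE // 100 for i in range(POP_SIZE)]   # 1% mutants

for gen in range(GENERATIONS):
    weights = [ADVANTAGE if mutant else 1.0 for mutant in population]
    population = random.choices(population, weights=weights, k=POP_SIZE)
    if gen % 50 == 0:
        print(f"gen {gen:3d}: {sum(population) / POP_SIZE:.1%} mutant")

print(f"gen {GENERATIONS:3d}: {sum(population) / POP_SIZE:.1%} mutant")
```

A typical run climbs from 1% toward near-fixation over the 200 generations, which is the "propagate any advantage through the population over many generations" point made above.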
 

Gracchus

According to the AMD manual, we are both wrong.
0xF - 4 bits
0xFF - 8 bits
0xFFFF - 16 bits
0xFFFFFFFF - 32 bits
0xFFFFFFFFFFFFFFFF - 64 bits
Yes, which is why I withdrew my post for editing. On the other hand, it is really off-topic, so I will not post the corrected version.

:sorry:
 

whois

No, because the nucleus of the cell does not exist until that code is first read and assembled. The cell is like one of the peripherals, which runs according to the instructions given it - and nothing more. The cell does exactly what the code in the DNA tells it to do - following its instructions to the ummmm - letter. ;)
The nucleus always exists; the only time it didn't is when life first came about.
Again no, one strand of DNA has every combination possible already set - it requires no external input to begin operating.
You missed the point.
The instructions of DNA are of a varying number of bytes;
computers use fixed-length instructions.
The computers I grew up with used 2 bytes: opcode and operand.
My question is, which one of these is DNA?
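For contrast with the variable-length gene sketch earlier in the thread, a minimal decoder of the fixed-length "2 bytes: opcode and operand" kind; the opcode names and the example program are invented for illustration.

```python
# Minimal sketch of a fixed-length instruction stream: every instruction
# is exactly 2 bytes (opcode, operand), so decoding needs no start/stop
# markers - unlike the start/stop-delimited gene parser sketched earlier.
OPCODES = {0x01: "LOAD", 0x02: "ADD", 0x03: "STORE"}   # invented names

def decode(program: bytes):
    for i in range(0, len(program), 2):                # fixed 2-byte stride
        opcode, operand = program[i], program[i + 1]
        yield OPCODES.get(opcode, "???"), operand

for op, arg in decode(bytes([0x01, 0x0A, 0x02, 0x05, 0x03, 0x0A])):
    print(op, arg)    # LOAD 10 / ADD 5 / STORE 10
```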
 