Unsatisfactory Scientific Explanations?

Gracchus

Senior Veteran
Dec 21, 2002
7,199
821
California
Visit site
✟30,682.00
Faith
Pantheist
Marital Status
Single
Politics
US-Others
the nucleus always exists, the only time it doesn't is when life first came about.
Life is just chemistry, a dynamic pattern of reactions. Just to clarify: Vitalism is defunct.
you missed the point.
The point is that the similarities between DNA and computer programming are metaphorical.
the instructions of DNA are of a varying number of bytes.
No, in DNA the instruction is exactly three codons. From the amino acids variable length proteins are assembled, often by different sections of DNA, perhaps even sections from different chromosomes. There is no underlying design, just tinkering, no optimal design, just make-do.
computers use a fixed length instruction.
It is not quite that simple. The old IBM 1401 had "machine language" instructions of varying lengths. And those programmable codes were executed by hard-wired procedures.
the computers i grew up with used 2 bytes, opcode and operand.
That is a matter of architecture. There are all sorts of ways to build and program computers. For instance you could interpret at the bit level, "010o" as one and "111o" as zero, thus ruling out one bit errors because an error of one bit would change the parity, from odd to even. Then you could use "100o" as a start signal and "001o" as a stop signal. This would be a form of binary coded octal. It takes up more storage and involves a bit more time to process but adds a layer of error detection.
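A rough sketch of that scheme (illustrative Python only, not any real hardware encoding) - every valid 3-bit symbol has odd parity, so any single flipped bit is caught:

# Parity-checked "binary coded octal" sketch: all four valid symbols have
# an odd number of 1s, so a one-bit error makes the parity even.
SYMBOLS = {"010": "one", "111": "zero", "100": "start", "001": "stop"}

def decode(symbol):
    if symbol.count("1") % 2 == 0:          # even parity -> a bit was flipped
        raise ValueError("single-bit error detected in " + symbol)
    return SYMBOLS[symbol]

print(decode("010"))   # one
print(decode("111"))   # zero
# decode("011") raises ValueError: even parity, so a bit got flipped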
my question is, which one of these is DNA?
Neither. Computers are not living systems. And analogies and metaphors are more like poetry than science. But think of it this way if you like: It is the proteins that determine the machine functions. The DNA/RNA is the micro-code that compiles not only to the machine language commands, but also to the machine structure. The analogy is not good, because cells and computers are very different. That isn't quite right, or really even nearly right, if you examine it too closely. In living cells there are many more layers of complexity than in computers. Closer would be that cells are computers that can modify not only their software but their hardware, in response to inputs. That's how you build a true AI.

Just sayin'! (Probably more than I should have!)

:wave:
 
Last edited:
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
Let's get this 64 bit confusion cleared up right now:

https://en.wikipedia.org/wiki/64-bit_computing

"In computer architecture, 64-bit computing is the use of processors that have datapath widths, integer size, and memory address widths of 64 bits (eight octets). Also, 64-bit CPU and ALU architectures are those that are based on registers, address buses, or data buses of that size. From the software perspective, 64-bit computing means the use of code with 64-bit virtual memory addresses."

"A 64-bit register can store 264 (over 18 quintillion or 1.8×1019) different values. Hence, a processor with 64-bit memory addresses can directly access 264 bytes (=16 exbibytes) of byte-addressable memory."

A 64 bit operating system just allows the use of larger datapath widths - allowing more data to be processed at once - and has nothing to do with the actual 2 bit binary code that runs the computer. You are trying to apply data architecture to the actual code that runs the machine.
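For what it's worth, the quoted figures are easy to check with a couple of lines of Python (just arithmetic, nothing architecture-specific):

# Verify the numbers quoted from Wikipedia above.
values = 2 ** 64
print(values)             # 18446744073709551616, i.e. about 1.8 x 10^19
print(values // 2 ** 60)  # 16 -> 2^64 bytes is 16 exbibytes (EiB)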
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
the nucleus always exists, the only time it doesn't is when life first came about.

And the nucleus comes about because the DNA tells it how to make itself.

http://www.cell.com/abstract/0092-8674(83)90132-0

you missed the point.
the instructions of DNA are of a varying number of bytes.
computers use a fixed length instruction.
the computers i grew up with used 2 bytes, opcode and operand.
my question is, which one of these is DNA?

Not true.

https://en.wikipedia.org/wiki/64-bit_computing
"In 32-bit programs, pointers and data types such as integers generally have the same length; this is not necessarily true on 64-bit machines"

So even now we are beginning to make use of variable length data instructions.
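A quick, platform-dependent way to see the size difference that quote describes (illustrative Python; the exact numbers depend on the build, so treat this as a sketch):

import ctypes

# On a common 64-bit build a C 'int' is still 4 bytes while a pointer is
# 8 bytes; on a 32-bit build both are typically 4 bytes.
print(ctypes.sizeof(ctypes.c_int))      # usually 4
print(ctypes.sizeof(ctypes.c_void_p))   # 8 on a 64-bit build, 4 on 32-bit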
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
I was referring to the Annabell network, which - as already described - is notable for having no linguistic coding.
When discussing mutations and evolution, it's assumed that they are heritable mutations, for that very reason. A beneficial mutation doesn't have to benefit the individual in whom it first appears. As long as that individual reproduces, it's the viable offspring that benefit and propagate any advantage through the population over many generations.
Clearly not if the mutation is heritable and advantageous.
It's not clear what you mean - it is the alleles of a gene that are dominant or recessive; the difference is caused by mutation at some point in the ancestral tree of one or other (or both) contributors.
In discussing genetics, variation usually refers to genetic variation. Mutations change DNA; when it changes genes, that is, by definition, genetic variation. But if you want to throw the net more widely, DNA controls development & function. Differences in development & function between creatures is variation between creatures.
The theory is based on observational evidence (also, it's not my theory, I just think it's right).
Individual mutations are not speciation. It would take a large number of significant mutations before a population of E. coli was sufficiently different to justify a new species name; it seems unlikely to be observed in vivo, given the relative consistency of their environment (the gut). Not being a microbiologist, I don't know the criteria that would apply to that kind of decision.
It's the alleles of a gene that are dominant or recessive. You really need to get a grasp of basic genetics, how genes, alleles, inheritance and mutation are related. I can't teach you that here, but there are good, clear sources online, such as the Learn Genetics site - or Wikipedia.
Nope - as I said, single mutations don't make a new species. Cross-breeding varieties doesn't make new species. Also, speciation has been observed both in the lab and in the wild.

But it only learns what an intelligence tells it.

ANNABELL does not have pre-coded language knowledge; it learns only through communication with a human interlocutor.

It learns as words are input. It then compares new words to the existing words that have been stored after input. You are simply programming it on the fly - thinking it is learning, when it knows nothing but what it has previously been told.

If you told it green meant red - it would become confused over time - and would never figure out by itself why, if green is red, red is not green. It only knows what you input into it - and is incapable of learning without that human teaching it.

Deaf twins on the other hand have worked out ways to communicate with each other in their own devised language - without the need for instruction.

https://en.wikipedia.org/wiki/Cryptophasia

Now stick two identical machines together with ANNABELL and show me they can develop communication between them without any human input or pre-programmed linguistics?????

I say they will sit there like a lump of metal and wait for the other to begin communication which will never happen since the other is also waiting for communication to begin.

It only "appears" to be learning" simply a sly trick of pre-programmed code telling it to pay attention to how words are used and with other words. You may think it learns like a human - but you would be deceiving yourself for believing that. It will never be able to make leaps of logic on its own. Never once dream of flowers or know their smell, even if it may talk to you about the smell of flowers because someone told it what flowers smelt like to them, It is an illusion of reality, nothing more.

If you tell it Hello at the start it might respond with hello, but will not understand why you said hello or why it should respond with hello, besides being told to do so by pre-programmed code. That it may learn to respond in the correct manner - does not mean it actually understands the meaning of its responses.
 
Last edited:
Upvote 0

lesliedellow

Member
Sep 20, 2010
9,654
2,582
United Kingdom
Visit site
✟119,577.00
Faith
Calvinist
Marital Status
Single
Politics
UK-Liberal-Democrats
Let's get this 64 bit confusion cleared up right now:

https://en.wikipedia.org/wiki/64-bit_computing

"In computer architecture, 64-bit computing is the use of processors that have datapath widths, integer size, and memory address widths of 64 bits (eight octets). Also, 64-bit CPU and ALU architectures are those that are based on registers, address buses, or data buses of that size. From the software perspective, 64-bit computing means the use of code with 64-bit virtual memory addresses."

"A 64-bit register can store 264 (over 18 quintillion or 1.8×1019) different values. Hence, a processor with 64-bit memory addresses can directly access 264 bytes (=16 exbibytes) of byte-addressable memory."

A 64 bit operating system just allows the use of larger datapath widths - allowing more data to be processed at once - and has nothing to do with the actual 2 bit binary code that runs the computer. You are trying to apply data architecture to the actual code that runs the machine.

Well not quite, because the x86-64 processors have 64 bit registers, but have only a 48 bit wide address bus, and have a 56 bit virtual address space.
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
Well not quite, because the x86-64 processors have 64 bit registers, but have only a 48 bit wide address bus, and have a 56 bit virtual address space.

Which has nothing at all to do with the 2 bit binary operating system - just how many calculations can be done and the amount of memory that can be addressed at one time.
 
Upvote 0

lesliedellow

Member
Sep 20, 2010
9,654
2,582
United Kingdom
Visit site
✟119,577.00
Faith
Calvinist
Marital Status
Single
Politics
UK-Liberal-Democrats
Which has nothing at all to do with the 2 bit binary operating system - just how many calculations can be done and the amount of memory that can be addressed at one time.

A sixty four bit full adder has 64 bits, not two. A processor can do as many calculations simultaneously as it has arithmetic logic units or floating point units. Most likely a modern processor will be making a memory access simultaneously with some operation on data currently held in its registers, or in the L1 cache.
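As a toy illustration of that point (Python pseudo-hardware, not how a real ALU is laid out), a 64-bit adder can be modeled as 64 one-bit full adders chained through their carries:

# Toy ripple-carry model: 64 one-bit full adders in a chain. Real hardware
# uses carry-lookahead and works in parallel; this is only a sketch.
def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total & 1, total >> 1             # (sum bit, carry out)

def add_64bit(x, y):
    result, carry = 0, 0
    for i in range(64):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result                             # final carry discarded (wrap-around)

assert add_64bit(12345, 67890) == 80235
assert add_64bit(2 ** 63, 2 ** 63) == 0       # overflow wraps, like a real register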
 
Upvote 0

Gracchus

Senior Veteran
Dec 21, 2002
7,199
821
California
Visit site
✟30,682.00
Faith
Pantheist
Marital Status
Single
Politics
US-Others
I think you mean each codon is 3 nucleotides.
The metaphor under discussion was comparing DNA to computer code. In that metaphor, each codon, three nucleotides, would be an instruction coding for an amino acid or a stop code.
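A toy version of that metaphor, using only a handful of entries from the real codon table (illustrative Python, nothing more):

# Read three nucleotides (one codon) per "instruction" and emit an amino
# acid, halting at a stop codon. Only a few entries of the real table here.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "GAA": "Glu",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(rna):
    protein = []
    for i in range(0, len(rna) - 2, 3):       # advance one codon at a time
        amino = CODON_TABLE.get(rna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("AUGUUUGGCGAAUAA"))           # ['Met', 'Phe', 'Gly', 'Glu']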

:wave:
 
Upvote 0

[serious]

'As we treat the least of our brothers...' RIP GA
Site Supporter
Aug 29, 2006
15,100
1,716
✟95,346.00
Faith
Non-Denom
Marital Status
Married
The metaphor under discussion was comparing DNA to computer code. In that metaphor, each codon, three nucleotides, would be an instruction coding for an amino acid or a stop code.

:wave:
Yeah, I got that, just making sure that everything is clearer for those who might not have the same training in biology.
 
Upvote 0

whois

rational
Mar 7, 2015
2,523
119
✟3,336.00
Faith
Non-Denom
Let's get this 64 bit confusion cleared up right now:

https://en.wikipedia.org/wiki/64-bit_computing

"In computer architecture, 64-bit computing is the use of processors that have datapath widths, integer size, and memory address widths of 64 bits (eight octets). Also, 64-bit CPU and ALU architectures are those that are based on registers, address buses, or data buses of that size. From the software perspective, 64-bit computing means the use of code with 64-bit virtual memory addresses."

"A 64-bit register can store 264 (over 18 quintillion or 1.8×1019) different values. Hence, a processor with 64-bit memory addresses can directly access 264 bytes (=16 exbibytes) of byte-addressable memory."

A 64 bit operating system just allows the use of larger datapath widths - allowing more data to be processed at once - and has nothing to do with the actual 2 bit binary code that runs the computer. You are trying to apply data architecture to the actual code that runs the machine.
it seems the definitions have changed somewhat.
i was taught that 8 bit machines, 16 bit machines, etc, referred to the data bus width.
the 8080 had an 8 bit data bus but its address bus was 16 bits, it was an 8 bit CPU.
this processor also used a multiplexed bus, it didn't have a separate data bus, the lower 8 bits of the address bus served that purpose.

the 6809 was one of the first processors that attempted 16 bit performance, but it too was an 8 bit CPU, though it had 16 bit registers.

the above processors are "ancient history" as far as CPUs go, late 70s
 
Upvote 0

Ratjaws

Active Member
Jul 1, 2003
272
37
69
Detroit, Michigan
Visit site
✟24,722.00
Faith
Catholic
whois said:
it seems astounding, but remember, a computer is about as smart as a brick. even you must realize that, if you understand the technology.
Frumious replied:
I understand the technology, but I don't really know what you mean - and I don't think you do either. A computer is just computational substrate - stuff that can process information; neurons in the brain are just computational substrate too - they must be organised, connected together, and trained, in order to be 'smarter than a brick'. A neural network emulation on a computer must be organised, connected together, and trained, in order to be 'smarter than a brick'. A brain may be orders of magnitude more complex than any artificial neural networks we have today, but the underlying principles are the same.
whois said:
current technology will never achieve the processing density of the human mind.
Frumious replied:
I agree; the technology of 5 or 10 years from now probably will - but as I said, that's not the real challenge.
*****************
whois said:
the most telling part though is, the paper neglects a large overhead in computing power.
for example, the machine it runs on, the high level code (that must be written by humans BTW).
Frumious replied:
Well yes; resources are required to process information; why is this 'the most telling part'? Your own brain uses roughly 20% of your total energy requirements, and about two thirds of that is for neural signalling (the other third is for maintenance & repair). The code written by humans in this context is the neural network emulation - analogous to the genetic & developmental instructions that direct the structural arrangement of the brain's language areas.
whois said:
more importantly, this "net" must emulate the brain, which i have serious doubts it will ever do.
Frumious replied:
Why 'must' it emulate the brain? It's an exploratory language acquisition model, no more than that. There are other projects whose ultimate aim is a brain emulation (I linked a couple previously), but this isn't one of them.
whois said:
.. computers fail miserably at abstract concepts.
Frumious replied:
What, like learning a language from scratch?
whois said:
computers will never "think".
Frumious replied:
Define 'think'.
whois said:
computers will never have a conscious.
Frumious replied:
I wouldn't put money on it, but you're in good company:

There is no reason anyone would want a computer in their home.


Ken Olson, president, chairman and founder of Digital Equipment Corp. (DEC)

Heavier-than-air flying machines are impossible.


Lord Kelvin, British mathematician and physicist, president of the British Royal Society

Fooling around with alternating current is just a waste of time. Nobody will use it, ever.


Thomas Edison

The energy produced by the breaking down of the atom is a very poor kind of thing. Anyone who expects a source of power from the transformation of these atoms is talking moonshine.

Ernest Rutherford
whois said:
...these will require a fundamental breakthrough in computer technology.
Frumious replied:
Already addressed.
Frumious & whois,
I read with fascination the dialogue between you two. What I find is both of you comparing computer architecture to the human brain. Unfortunately, in doing so you both use terms loosely. You also make claims that may be possible, but again, unfortunately, some that are surely impossible. At times you seem to be aware of the inherent difference between an artificial machine and biological gray matter. You seem to agree that there is a great difference in complexity of structure that translates to the inability of a computer to "think" in an abstract way. You also seem to believe that coming up with a new computer structure that more closely mimics the human brain will allow artificial machines to imitate, possibly even at an essential level, and become human in its ability to think. I have to again strongly disagree with the latter point here.

In speaking of the human "mind" you use the term loosely in that I suspect you consider it the function of the brain. It seems to both of you that thinking is merely a brain in its activity of neural network function. If so it is here I fundamentally disagree, for to me the term "mind" is shorthand for the thinking... that is the abstracting, reasoning, memorizing and the imaginative powers of a human person. Yet no matter how complex we find the human brain to be, or how closely we can relate/map particular kinds of thoughts to an area of the brain matter, it does not have the power of thought. As I've brought out in other posts the human mind works through the brain to in effect give its matter the power to express thought. In other words there is a mysterious connection between the spiritual soul here (the immaterial form) and matter that causes this body to transcend its natural powers.

So I strongly disagree that the human brain and artificial neural nets have "the underlying principles" to achieve intelligence. I believe current computer technology, as it becomes more complex, could emulate the human brain more closely. I just don't believe they are the same in principle. I believe it was whois who has hinted at this when he makes the case that a completely new and different kind of technology must be developed for computers to become more closely human in their "thinking." I'm not sure how far he would take this but while I suggest a substantially different kind of "computer" might more closely imitate the brain, it could never do so substantially. If it could it would not be a machine but a human organ. Recall the story of Frankenstein here.

Of course this is the first barrier I've talked about in making machines that "think." I think it is in itself an infinite leap: to go from some type of machine, biological or not, to a human organ. Nevertheless even if this barrier could be overcome there is still the second more substantial barrier. I've laid this out before as moving from material form to immaterial form. Again I don't disclaim here that, with time, a machine can more closely imitate human thinking, but the key idea is one of imitation versus doing so in principle. Seems to me whois brings this out with his insistence that computers today are insufficient when it comes to abstraction. I think this is key because this power is one much greater than even that of reasoning. I realize I live in a culture that has been taught reasoning is the ultimate human characteristic that differentiates us from all other living beings in this world.

CONTINUED...
 
Upvote 0

Ratjaws

Active Member
Jul 1, 2003
272
37
69
Detroit, Michigan
Visit site
✟24,722.00
Faith
Catholic
CONTINUED...

Yet I find the ability to think much more than just reasoning. Prior to any act of reason (more properly called inference) is the first and simplest act called simple apprehension. This act results in an idea or concept being formed in the mind. In philosophy we define this as an act by which the mind becomes cognizant of an essence without affirming or denying anything about it. Next comes judgment that is an operation of the mind composing and dividing by affirming or denying. Judgment presupposes the first act that forms concepts. To judge requires at least two concepts that must be either confirmed or denied in order to arrive at a new composite or portion of the original ideas. We end with a logical entity in the mind that can be significantly qualified as true or false. A judgment is therefore an act of the mind that can be significantly qualified as true or false.

Now as I've said inference, which is reasoning properly so-called, presupposes the simpler acts of mind; simple apprehension by which we obtain concepts and the act of judging which is expressed in proposition. In inference the mind moves from propositions already known to others not initially known but following upon the known ones by a kind of logical necessity. Thus inference is defined as an act of the mind by which from truths already known the mind comes to other truths. These judgments can be mediate or immediate, but the former more properly belongs to reasoning because one judgment leads to another only through the mediation of one or more other judgments. Inference is expressed in language as arguments and considered fruitful if these lead to certitude or prudent opinion concerning important matters not otherwise known. It is fruitless if it leads to mere doubt or ignorance, or to truths either trite or well known apart from the argument. Finally arguments are fallacious if they lead to error.
 
All this to make the point that philosophy in general (and metaphysics in particular), while it might be "heady," is a coherent line of thought that deals with real being just as material (empirical) science does. In fact were I to get deeper into this we would find more and more terms to be defined. This is precisely because language is at the root of philosophy just as it is with material science. Whereas philosophical ideas have to do with essential being, material science deals with the non-essential. In your dialogue you talk about how to cause a machine to mimic the human brain, which you consider synonymous with thinking, yet you never define thought. It's one thing to program a computer to handle the task of categorizing, say, the animal kingdom. Yet it is entirely another to program it to actually understand such categorization... or why even do it. Consider terms such as "I," "he" and "a." The human mind can associate these symbols with the real beings they represent and in doing so comprehend why... on the contrary a computer can only memorize the assigned definitions. The computer must always have a human form behind it in order to look human!

When was the last time you saw an animal laugh? Never. This is because it is not in their nature to do so. Yet human persons do because it is their nature to find incongruency in things. Nor do we find trees or rocks laughing. This is not because they are not as complex but because it is not in their nature. If it were possible for a computer to appreciate the beauty of a sunset then I'd say it was intelligent. Yet nowhere do we find matter appreciating matter arranged in any particular way precisely because it does not have it within itself to do so (beauty is not just in the eye of the beholder but deeply within the mind behind that eye). No! ...even the most complex forms of living matter that are not human cannot do what humans do, so how could a complex, or more complex, computer be expected to do so? Arrange it into neural nets and do so a billion billion times more complex than we now have and it still will not be human. It will not think. It could not choose freely as humans do all the time. Artificial machines may imitate these personal characteristics, as animals do to some degree, but they will never take on these human abilities just as animals, plants and inanimate matter never do. Computers will never speak in a way that persons do using language with understanding. Artificial machines will never choose heaven over hell, right over wrong, or love over hate because it is not within their nature. Humans have these powers in potency, material and spiritual, but the rest of nature does not. Computers being produced by intelligent beings will "look" like their makers but never have within them the non-material component they need to think and love, unless we find a way to put a soul in pure matter. I suggest here this is an impossibility even if we were able to use living matter to produce computers as whois suggests. If whois is correct here, and I have my doubts whether this is even possible, computers will still not be more than animals in their capacity to act precisely because of their material-only form.

Proof? You want evidence of what I'm saying here? It's all around us! Language being one irrefutable evidence of the impossible barrier one has to scale in order to make machines intelligent and free-willed. Language where we allow being outside us (and within) to form (unite) our mind according to its nature, then using these concepts from within our mind (information) to express those beings, either verbally or in writing, we communicate (cum, with + municare, union... "union with") them; or more precisely unite ourselves with another person while sharing the ideas held within (formed within) our intellect. I finish by saying you guys cheapen what it means to be personal beings when you equate machine "intelligence" to human. You must tread lightly here, otherwise you attack the inherent dignity a person has by falsely elevating machine to life and matter to spirit. You also falsely attack me for not giving evidence for my perspective on this subject, yet it is you who fail to do so! My proofs are all around us as I've laid out here in cursory fashion, so it is simply your blindness to the limitations of the scientific method that keeps you from seeing as the average unscientifically trained person does.

"The most relevant truth about physicalist psychology is still the statement made by Priestley, one of the founders of associationist psychology: 'I see clearly and acknowledge readily, that matter and motion however subtly divided, or reasoned upon, yield nothing more than matter and motion still.' Much of the confusion in today's psychology comes from the fact that physicalists can be so forgetful of such an elementary truth." (quote from a paper by Dr. Stanley Jaki)
 
Last edited:
Upvote 0

[serious]

'As we treat the least of our brothers...' RIP GA
Site Supporter
Aug 29, 2006
15,100
1,716
✟95,346.00
Faith
Non-Denom
Marital Status
Married
Upvote 0

whois

rational
Mar 7, 2015
2,523
119
✟3,336.00
Faith
Non-Denom
Frumious & whois,
I read with fascination the dialogue between you two. What I find is both of you comparing computer architecture to the human brain. Unfortunately, in doing so you both use terms loosely. You also make claims that may be possible, but again, unfortunately, some that are surely impossible. At times you seem to be aware of the inherent difference between an artificial machine and biological gray matter. You seem to agree that there is a great difference in complexity of structure that translates to the inability of a computer to "think" in an abstract way. You also seem to believe that coming up with a new computer structure that more closely mimics the human brain will allow artificial machines to imitate, possibly even at an essential level, and become human in its ability to think. I have to again strongly disagree with the latter point here.
the discussion seems to be between a simulation of the brain and some type of analogy for DNA/cell machine.
the former is in relation to AI, simulations.
the latter is in regard to a software/ hardware approach to the cell.
In speaking of the human "mind" you use the term loosely in that I suspect you consider it the function of the brain. It seems to both of you that thinking is merely a brain in its activity of neural network function. If so it is here I fundamentally disagree, for to me the term "mind" is shorthand for the thinking...
there is no way around this, a brain is required to think.
thinking, or your thoughts, come from nowhere else but your own mind.
this is probably the major objection to the god scenario, it's simply inconceivable that thoughts can originate without a brain.
As I've brought out in other posts the human mind works through the brain to in effect give its matter the power to express thought. In other words there is a mysterious connection between the spiritual soul here (the immaterial form) and matter that causes this body to transcend its natural powers.
i've often thought along similar lines in regards to life itself.
it isn't the molecules that are alive, but the configuration of these molecules allows life to manifest itself in our reality.
So I strongly disagree that the human brain and artificial neural nets have "the underlying principles" to achieve intelligence.
that's why i keep saying it's going to take a fundamental breakthrough to achieve it.
I believe it was whois who has hinted at this when he makes the case that a completely new and different kind of technology must be developed for computers to become more closely human in their "thinking." I'm not sure how far he would take this but while I suggest a substantially different kind of "computer" might more closely imitate the brain, it could never do so substantially. If it could it would not be a machine but a human organ. Recall the story of Frankenstein here.
yes, it was me, and yes, i also agree that man will not come up with this idea but it will be modeled directly on the brain itself.
i believe the best mankind can hope for is the mind/machine interface.
IOW, he will not be able to improve on the brain, but merely extend its capabilities.
once this interface has been achieved, the metaphysical will become a reality.
it will completely change our way of life, and how we think about things.
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
A sixty four bit full adder has 64 bits, not two. A processor can do as many calculations simultaneously as it has arithmetic logic units or floating point units. Most likely a modern processor will be making a memory access simultaneously with some operation on data currently held in its registers, or in the L1 cache.

All computers run on binary code. The operating system you see is merely an interface between you and the binary operating system, that actually runs the computer. Otherwise you couldn't type "search" without typing "01110011 01100101 01100001 01110010 01100011 01101000 00001010"

You are confusing the interface that controls memory addresses and simplifies your input with the actual code the computer uses to perform the calculations. There is no language but binary that runs every computer in every home in the world.
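For anyone curious, that bit string really is just "search" plus a newline in 8-bit ASCII; one line of Python reproduces it:

# Print "search\n" as 8-bit ASCII binary, one byte per character.
print(" ".join(format(ord(c), "08b") for c in "search\n"))
# 01110011 01100101 01100001 01110010 01100011 01101000 00001010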

https://en.wikipedia.org/wiki/Binary_code

http://study.com/academy/lesson/binary-language-of-computers-definition-lesson-quiz.html

http://csunplugged.org/binary-numbers/
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
the discussion seems to be between a simulation of the brain and some type of analogy for DNA/cell machine.
the former is in relation to AI, simulations.
the latter is in regard to a software/ hardware approach to the cell.

there is no way around this, a brain is required to think.
thinking, or your thoughts, come from nowhere else but your own mind.
this is probably the major objection to the god scenario, it's simply inconceivable that thoughts can originate without a brain.

i've often thought along similar lines in regards to life itself.
it isn't the molecules that are alive, but the configuration of these molecules allows life to manifest itself in our reality.

that's why i keep saying it's going to take a fundamental breakthrough to achieve it.

yes, it was me, and yes, i also agree that man will not come up with this idea but it will be modeled directly on the brain itself.
i believe the best mankind can hope for is the mind/machine interface.
IOW, he will not be able to improve on the brain, but merely extend its capabilities.
once this interface has been achieved, the metaphysical will become a reality.
it will completely change our way of life, and how we think about things.

Agreed - the "brain" is nothing but the storage device. Consciousness is the energy coursing through the brain. Even in human terms one is not dead until the brain stops transmitting electrical signals. Until then it is possible to restart a heart etc. But once all electrical activity in the brain ceases - there is nothing that can bring you back but God Himself - who is also Consciousness/Energy. Hence our hope that God will "remember" us and the origin of the memorial tomb.
 
Upvote 0

lesliedellow

Member
Sep 20, 2010
9,654
2,582
United Kingdom
Visit site
✟119,577.00
Faith
Calvinist
Marital Status
Single
Politics
UK-Liberal-Democrats
You are confusing the interface that controls memory addresses and simplifies your input with the actual code the computer uses to perform the calculations. There is no language but binary that runs every computer in every home in the world.

I am not confusing anything with anything. Have you ever programmed a computer in assembly language, written a device driver, parsed ACPI tables perhaps, or read PCI Configuration Space? Well I have, and I have no need to cut and paste from Wikipedia.
 
Upvote 0

[serious]

'As we treat the least of our brothers...' RIP GA
Site Supporter
Aug 29, 2006
15,100
1,716
✟95,346.00
Faith
Non-Denom
Marital Status
Married
All computers run on binary code. The operating system you see is merely an interface between you and the binary operating system, that actually runs the computer. Otherwise you couldn't type "search" without typing "01110011 01100101 01100001 01110010 01100011 01101000 00001010"

You are confusing the interface that controls memory addresses and simplifies your input with the actual code the computer uses to perform the calculations. There is no language but binary that runs every computer in every home in the world.

https://en.wikipedia.org/wiki/Binary_code

http://study.com/academy/lesson/binary-language-of-computers-definition-lesson-quiz.html

http://csunplugged.org/binary-numbers/
Computers run on binary in the same way the brain does. Each individual logic gate either sends a signal or doesn't. Likewise, each neuron either sends a signal or doesn't. Both computers and brains can process multiple binary streams. Computers process each stream faster, while the brain can process more in parallel. This gives each an edge in certain tasks, much like GPUs and CPUs are better at different tasks.
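As a rough sketch of that analogy (toy Python, not a model of real neurons), a logic gate and a simple threshold unit both produce a bare fire/don't-fire output:

# Both functions emit a binary output: 1 (fires) or 0 (doesn't).
def and_gate(a, b):
    return 1 if (a and b) else 0

def neuron(inputs, weights, threshold):
    # Fire if the weighted sum of inputs reaches the threshold.
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

print(and_gate(1, 1))                     # 1
print(neuron([1, 1], [0.6, 0.6], 1.0))    # 1 - behaves like an AND gate here
print(neuron([1, 0], [0.6, 0.6], 1.0))    # 0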
 
Upvote 0

Justatruthseeker

Newbie
Site Supporter
Jun 4, 2013
10,132
996
Tulsa, OK USA
✟177,504.00
Country
United States
Gender
Male
Faith
Non-Denom
Marital Status
Widowed
Politics
US-Others
Computers run on binary in the same way the brain does. Each individual logic gate either sends a signal or doesn't. Likewise, each neuron either sends a signal or doesn't. Both computers and brains can process multiple binary streams. Computers process each stream faster, while the brain can process more in parallel. This gives each an edge in certain tasks, much like GPUs and CPUs are better at different tasks.

The difference is computers are at this time limited to 64 bit data buses - information pathways - while the brain has billions of neural pathways for the transference of data. Which is why you can make leaps of logic and think and the computer can not. If you have the latest computer it has at the most 8 processors working in tandem - the brain billions. The computer is faster at certain tasks because those tasks require less data. Smiling uses more calculations to control muscles than a computer processes to calculate pi to the millionth digit. Because the computer is focused solely on that one calculation - while you are thinking about multiple things - processing billions of bits of data - while in the process of smiling. Even while focused on one specific task - your brain is continuing to process all external stimuli - taking in billions of bits of data and processing them in the blink of an eye. So which is really faster? How fast would your brain really work if it processed no other stimuli while it focused on one specific task? You may think you are focusing on the task at hand - but in the background it is continuing to process more data than you could possibly dream.

Now flood that computer with external stimuli of billions of bits of data unrelated to the task at hand - but still needing to be processed - and let's all watch it slow to a crawl.
 
Last edited:
Upvote 0