
My view on souls

Status
Not open for further replies.

ragarth

Well-Known Member
Nov 27, 2008
1,217
62
Virginia, USA
✟1,704.00
Faith
Humanist
Marital Status
Single
How nice.
Those of us who have done any programming at all will recognize a problem with this line of thinking.
A self-awareness subroutine would need to loop, creating a series of "me"s in non-continuous but sequential moments of time. And approaching the level of awareness we appear to have, that the "me" of this second is the same "me" from several seconds ago, would require a very tight loop.

In a very practical sense, however, these are not the same "me"s but just a series of similar calculations. Does that explain my perception of "me"? I find the idea interesting but unsatisfactory.

Not to mention that programming loops tend to do bad things to computers, especially tight loops.

No, while an interesting notion, I do not see how processing loops can explain our sense of self-awareness. It goes beyond just a simple status query, to the point that there is a very real perception that this "me", while part of this body, is something more.

How nice
Those of us who have done any research into neuroscience at all know that the human brain is not a computer, and that a comparison between the two is fallacious.

Before I start, how about I give you some credentials? I'm a computer science major studying to go into computational neuroscience. I program in Python, Java, C++, and Emergent. I like Python the most, however, because it's quick and easy; I'm still learning Emergent. At the moment I work specifically with Hopfield neural networks because of their simplicity as simple-summation perceptrons, and because the single layer makes it easy to experiment with some learning algorithms.
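
(If you haven't played with one, the "simple-summation" update is nothing exotic: a node just sums its weighted inputs and thresholds the result. A toy sketch in Python, with made-up numbers:)

Code:
# One Hopfield-style node: sum the weighted inputs from its neighbours,
# then threshold at zero to get the node's new +/-1 state.
weights = [0.5, -1.0, 0.75]    # connection weights to three neighbours (made-up)
inputs  = [1, -1, 1]           # current +/-1 states of those neighbours

activation = sum(w * x for w, x in zip(weights, inputs))
new_state = 1 if activation >= 0 else -1
print('activation = %s, new state = %s' % (activation, new_state))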

My major project right now is to produce a mathematical model of a neuron that captures neurotransmitter decay and its effect upon impulse frequency. My hypothesis is that a single neuron can perform complex calculations and encode the results via frequency modulation, using the interplay of neurotransmitter decay rates and impulse times.
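
Very roughly, the kind of interplay I mean looks like this in code (toy constants pulled out of thin air, nothing like the actual model):

Code:
import math

# Toy sketch: transmitter-driven "drive" decays exponentially each time step,
# input excitation accumulates, and the cell emits an impulse when the drive
# crosses threshold. A faster decay_rate lengthens the interval between
# impulses, so the decay rate ends up modulating the output frequency.
decay_rate = 0.05       # decay constant per time step (made-up)
drive_per_step = 0.03   # excitation added each time step (made-up)
threshold = 0.5

level = 0.0
spikes = []
for t in range(1000):
    level = level * math.exp(-decay_rate) + drive_per_step
    if level >= threshold:
        spikes.append(t)
        level = 0.0     # reset after an impulse

intervals = [b - a for a, b in zip(spikes, spikes[1:])]
print('impulses: %d, mean interval: %s' % (len(spikes),
      float(sum(intervals)) / len(intervals) if intervals else 'n/a'))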

Now, to your poor metaphor of the brain as a computer. Since you have a basis in how a computer works, you clearly know that a computer is a serial processing device. We do some tricks that modify this (multi-core processors, hyperthreading, multiple CPUs, pipeline streamlining, etc.), but they are still effectively serial when you compare the sheer number of operations performed against the number that can be performed at the same time. Further, no matter how many cores the system has, no computer has multiple buses, ergo it has a distinct bottleneck.

Hardware neural networks, in comparison, are very different: every node (neuron) is a single processing unit of very low complexity; in the FACETS program a single node can be simulated with no more than 100 components. This makes a neural network massively parallel: a 100-node neural network can perform 100 operations at one time, and a 10-billion-node neural network (the human brain) can perform 10 billion operations at one time (this is a little inaccurate, since the state of an operation in a neural network is different than in a classical computer, but that gets into some pretty heavy theory to explain). It is important to note, however, that if all the neurons in your brain fired at once you'd have a seizure. :) There is no standard clock in neural networks and no need for one, so a neural network is asynchronous in nature. I hope I can assume you know that attempts at asynchronous designs with serial processors have had mixed results?
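
A quick way to see the synchronous-versus-asynchronous difference in code: take a toy two-node network where each node inhibits the other (made-up weight of -1 each way). Update both nodes on a shared clock and they flip back and forth forever; let them update one at a time and they settle:

Code:
# Two mutually inhibitory nodes with +/-1 states.
# Synchronous ("clocked") update: both nodes read the old state at the same time.
state = [1, 1]
for step in range(4):
    state = [1 if -1 * state[1] >= 0 else -1,
             1 if -1 * state[0] >= 0 else -1]
    print('synchronous step %d: %s' % (step, state))   # oscillates forever

# Asynchronous update: nodes take turns, each seeing the other's latest state.
state = [1, 1]
for step in range(4):
    i = step % 2
    state[i] = 1 if -1 * state[1 - i] >= 0 else -1
    print('asynchronous step %d: %s' % (step, state))  # settles at [-1, 1]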

There is no central bus in a neural network: every node connects to those it needs to (in a Hopfield network, every node connects to every node). The complex-connection model (synapses with varying weights) models the memory ability of biological neural networks, showing that no connection to a separate memory device (RAM or hard disk) is necessary. That makes the only real bottlenecks the travel time of data down a connection and, of course, the speed of the nodes.
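
To make the "memory lives in the weights" point concrete, here's a toy Hopfield-style sketch: two small made-up patterns are stored by a Hebbian outer-product rule, and one of them is then recalled from a corrupted cue using nothing but the connection weights:

Code:
# Store two patterns in the connection weights (Hebbian outer-product rule),
# then recall one from a corrupted cue. There is no separate memory device:
# the "storage" is entirely in the weights.
patterns = [[1, 1, 1, 1, -1, -1, -1, -1],
            [1, -1, 1, -1, 1, -1, 1, -1]]
n = len(patterns[0])

weights = [[0.0] * n for _ in range(n)]    # no self-connections (diagonal stays 0)
for p in patterns:
    for i in range(n):
        for j in range(n):
            if i != j:
                weights[i][j] += p[i] * p[j]

state = list(patterns[0])
state[0] = -state[0]                       # corrupt one bit of the first pattern

for sweep in range(3):                     # a few asynchronous update sweeps
    for i in range(n):
        total = sum(weights[i][j] * state[j] for j in range(n))
        state[i] = 1 if total >= 0 else -1

print('recalled: %s' % state)
print('matches stored pattern: %s' % (state == patterns[0]))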

Because of the serial nature of classical computing, you are right that a normal processor is non-continuous. While there is some overlapping of informational states within it, the structure of a modern computer does not provide a continuous flow from one dataset to the next. However, because of the parallel and asynchronous nature of neural networks, it is possible for there to be a continuous logical state from one moment to the next; indeed, it's hypothetically possible for a single neural network to maintain multiple continuous logical states at the same time. :p

What does all this mean? Your brain is not a computer program; comparing your brain to a computer program is about as silly as comparing my cellphone to a yummy cheesecake. They are each good at different things, and they each have different limitations. They are two very different computational devices.

I do take offense at your assumption as to the state of my knowledge. What are your credentials?

N.B. As stated in my last post, I have no time to go into showing you why your apparent idea of what programming self-awareness in a classical computing paradigm would involve is woefully ill-informed; everything in this post is just off the top of my head, with no external research. Perhaps you should do some research into the state of facial recognition software, since recognizing one's own image is similar (in principle) to recognizing someone else's image. This would meet the minimal test criterion of being able to recognize itself in a mirror.
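
The principle I have in mind is just recognition by comparison: reduce an image to a handful of features and compare them against a stored "self" template. Grossly simplified, with made-up numbers (real facial recognition is far more involved than this):

Code:
import math

# Compare a "seen" feature vector against a stored self-template.
# Above some similarity threshold, the system calls the image its own.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

self_template = [0.9, 0.1, 0.4, 0.8]      # stored features of "my own" face (made-up)
mirror_image  = [0.85, 0.15, 0.38, 0.8]   # features extracted from the mirror
someone_else  = [0.2, 0.9, 0.7, 0.1]

for name, features in [('mirror', mirror_image), ('stranger', someone_else)]:
    score = cosine_similarity(self_template, features)
    print('%s: similarity %.2f -> %s' % (name, score,
          'that is me' if score > 0.95 else 'not me'))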

-------------

I'm sorry, I've just gotta say, I'm really amused by your idea of loops being bad for a computer. Perhaps you mean infinite loops? I've been sitting here puzzling over your apparently exotic idea of what a loop is; loops are used constantly in programs:

Code:
while True:
    s = raw_input('Enter something: ')   # keep prompting until the user types 'quit'
    if s == 'quit':
        break
    print 'Length of the string is', len(s)
print 'Done'
Is this bad for a computer?

//Just reread your statement; you do indeed say tight loop, but only once, and you're still characterizing loops in general as being bad for computers, something I find incredibly amusing. Also, a tight loop, as you should know, is a programming error (same as an infinite loop), and there's nothing in your statement that even hints that a 'self-awareness' program would produce a tight loop; you just presume this out of the aether.

My programming example would become a 'tight loop' if I removed the "if s == 'quit': break" part. So, in essence, our handy-dandy tight-loop self-awareness program could cease being a tight loop by putting in an escape routine. This programming error has nothing to do with the capacity of a program to perform a given function; your assumption that it would appear in a self-awareness program goes beyond a straw man and into the realm of the just plain bizarre.//
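
Concretely, the snippet without the escape would just be:

Code:
# same loop with the escape removed: it now has no exit condition and runs forever
while True:
    s = raw_input('Enter something: ')
    print 'Length of the string is', len(s)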

I meant to make this post short. :-( I think I failed; neural networks are a bit of an obsession of mine, and I apologize for typing your ear off about them.
 

DrBubbaLove

Roman Catholic convert from Southern Baptist
Site Supporter
Aug 8, 2004
11,336
1,728
65
Left coast
✟100,100.00
Gender
Male
Faith
Catholic
Marital Status
Married
Politics
US-Others
I was not precise because it is not necessary for this discussion, but I'm glad you are deep in your studies. I do think the comparison is helpful because these posts were implying it should be rather simple to "upload" "me" into some storage medium and later reproduce "me" in another body. Nor did I mean to poke fun at your knowledge or question your logic in regard to computers.

Tight loops can "tend" to do "bad" things is what I believe I said, and to an extent that is true. For one thing, it takes more time, and yes, parallel processing allows for even infinite "periods" of time, but last time I checked our brains are nowhere near infinite, and people have survived massive brain trauma without significant changes in who they are.

The problem as I see it, even with parallel systems, even many parallel systems, is that what is uniquely "me" tends to get lost in the calculations, and there is then not much difference between what is "me", or what makes me "me", as opposed to another brain. I can be thinking of many things at the same time, but there is never a loss of it being "me" doing all that thinking.

Theoretically, if parallel processing could explain "me", then it seems to me that more of us should be very much alike. I have failed to find that true in my life, and that is speaking from practical experience rather than detailed knowledge of how computers or brains work.
 

ragarth

Well-Known Member
Nov 27, 2008
1,217
62
Virginia, USA
✟1,704.00
Faith
Humanist
Marital Status
Single
I was not precise because it is not necessary for this discussion, but I'm glad you are deep in your studies. I do think the comparison is helpful because these posts were implying it should be rather simple to "upload" "me" into some storage medium and later reproduce "me" in another body. Nor did I mean to poke fun at your knowledge or question your logic in regard to computers.

Tight loops can "tend" to do "bad" things is what I believe I said, and to an extent that is true. For one thing, it takes more time, and yes, parallel processing allows for even infinite "periods" of time, but last time I checked our brains are nowhere near infinite, and people have survived massive brain trauma without significant changes in who they are.

The problem as I see it, even with parallel systems, even many parallel systems, is that what is uniquely "me" tends to get lost in the calculations, and there is then not much difference between what is "me", or what makes me "me", as opposed to another brain. I can be thinking of many things at the same time, but there is never a loss of it being "me" doing all that thinking.

Theoretically, if parallel processing could explain "me", then it seems to me that more of us should be very much alike. I have failed to find that true in my life, and that is speaking from practical experience rather than detailed knowledge of how computers or brains work.

Righto, sorry if I gave the impression that I was talking about running a simulated consciousness on a classical computer. It's my opinion that we will only really achieve AI either via neural network coprocessors (something I'm interested in experimenting with!) or via 'hard' AI wherein the code is not representative of an NN model. I like NN-based AI, but I do recognize its severe limitations in classical computing. Technically speaking, though, a tight loop is always a bad thing. :D

Beyond that, I'll let you have the last word; I really do need to buckle down on my studies. I have no free time anymore due to college and work and the repercussions of my mom's accident (there's now a single working car to get three people to their jobs, separated by 45 minutes of travel in one direction, all at about the same time). So it's been a pleasure; I love talking about neuroscience and computers. It really is an obsession!
 

Chesterton

Whats So Funny bout Peace Love and Understanding
Site Supporter
May 24, 2008
26,224
21,439
Flatland
✟1,082,109.00
Faith
Eastern Orthodox
Marital Status
Single
Perhaps you should do some research into the state of facial recognition software, since recognizing one's own image is similar (in principle) to recognizing someone else's image.

(In principle) an "altruistic" bacterium is similar to Sydney Carton.

IMHO, to save potentially wasted effort, prove the science first, then build a philosophy upon it. Jesus told parables about men who build on shaky foundations. ;)
 