Where does morality come from?
<blockquote data-quote="stevevw" data-source="post: 74977587" data-attributes="member: 342064"><p><span style="color: rgb(0, 0, 0)"> No it's not. The Trolley problem specifically says you see a trolley coming along a track and it's going to run down 5 people. Do you change the track so that it misses those 5 people and hits 1 person instead? So not changing the track you are saying you will allow the trolley to hit the 5 people. That is the idea of the Trolley thought experiment to see if someone will take action to avoid something worse happening. Therefore in the Trolly problem there are only 2 choices. What I am saying is that as you and I have indicated that choosing to avoid mass murder and more misery, loss is the right option. That's because it is always the right option. Any other option is objectively wrong regardless of what a person's subjective opinion is. </span></p><p><span style="color: rgb(0, 0, 0)"> </span></p><p> <span style="color: rgb(0, 0, 0)">First the trolley problem does not specify that. You are adding further criteria to the equation. Then you open the door to add other stuff when you were insistent that I could not change anything in the trolley problem like there were brakes on the trolley or I could beep the trolley horn to warn the people. Otherwise, I could say what if the 5-year old was had a severe disability or was a diagnosed psycho maniac. </span></p><p><span style="color: rgb(0, 0, 0)"></span></p><p><span style="color: rgb(0, 0, 0)">But the other factor about why it is right to kill 1 rather than 5 is that you are multiplying the killing and doing mass murder (more of a bad thing) is always seen as a greater wrong regardless of the circumstances of the age and background. </span></p><p><span style="color: rgb(0, 0, 0)"></span></p><p> I think we have misunderstood each other. I was stating that two people who both support objective morality IE two Christians will come to the same objective moral position. I wasn't trying to prove objective morality but to show how people who support objective morality will have the same morals.</p><p></p><p> Actually I think you are not understanding the true implications of subjective views when it comes to morality. If subjective morality is similar to a persons "likes and dislikes" for food for example (choc icecream v vanilla) then how is there a distinction.</p><p></p><p>Applied to morality it doesn't matter if one chooses choc as opposed to vanilla (moral view A or B) as "likes and dislikes" don't say anything about what is correct or right. You cannot say that your moral option (personal like) is more correct than someone else no more than you can say you're like for Vanilla icecream is correct as opposed to someone else's like for choc. So whether applied to like or dislikes or morality the subjective choices are equal in the overall scheme of things IE ultimately. </p><p></p><p> Why is it scary.</p><p></p><p> Therefore it is not the same as the Trolley Problem. The Trolley problem only allows for the person operating the trolley to make the decision to switch tracks not some detached person who cannot see what is happening.</p><p></p><p> Yes, the same person who designed the brakes which will stop the car and avoid knocking down the pedestrian faster than any human reaction.</p><p></p><p>But if you read the article I linked it states that machines are not human and cannot think like humans as far as how other drivers and pedestrians act no matter how you program them. 
That is what the experts are saying is why it is a wrong comparison to the Trolley problem because the Trolley problem is about the person at the scene (a human) seeing what is happening and changing the tracks accordingly. IE</p><p></p><p><em><span style="color: #00b3b3">The Trolley Dilemma has also been applied to autonomous vehicles, since in the face of a potential accident, the software may be required to decide between several courses of action.</span><strong> <span style="color: #b300b3">That’s despite the fact that it’s considered an extremely flawed way to think about a complicated problem by prominent ethicists and researchers. </span></strong></em></p><p></p><p>Humans are not robots and can anticipate unpredictable situations, do other people's thinking to anticipate something happening. A robot (automated car) can never be programmed to do that, therefore, using it in a trolley thought experiment is as they say a terrible model for the Trolley problem.</p><p></p><p> Now you're adding new additions to the Trolley problem which you said we cant do. You said the people were workers on the Trolley lines. If they are just workers then they would be more aware of looking for something coming. After all, they are working on a trolley line where Trolley comes along.</p><p></p><p> Though I gave the objective moral position for it my objection was about having to consider such an unreal situation in the first place. Because if all options were available then the objectively right thing to do would be what most people would do and that is to beep the horn and yell to the workers or yell and beep to get someone else to warn and get the workers off the track and avoid killing anyone at all.</p><p></p><p>But you have made the trolley problem that way and taken away those real-life options. What the ethical experts were complaining about was how unreal the situation was with all the examples. You are denying human agency which is capable of extraordinary things when trying to save lives in a crisis situation and ironically are making humans robots.</p><p></p><p> But then that would not be the Trolley problem as a small amount of damage is different from killing someone. It is different because the Trolley problem only has tow options kill 5 or kill 1 in hitting a switch. A car heading towards a pedestrian who they may kill and swerves away from that into an unknown there is no definite killing for that option thus the moral dilemma is different.</p><p></p><p>Most people would probably hit the brakes but this is not allowed or you saying the person must die anyway. So then most people will swerve away thinking that they will avoid killing the pedestrian. But unlike the Trolley problem, they will not know or think they are definitely going to kill someone else as a result.</p><p></p><p> They cannot be held responsible because they are not driving the car. That may be a lawsuit but for morality it is different. The programmer has not intended to kill anyone, they are not actively involved in killing anyone at the scene. They have programmed the car to do everything they can do to act like a human if not better. At worst they may be indirectly culpable but moral wise they have not committed any intentional act. But applying all this to your situation where a person darts out from between parked cars illegally then the fault will mostly lie with the pedestrian.</p><p></p><p> Your assuming a lot. The 5-year-old may be a cripple who is suffering every day of their life. 
They may drown in an accident the next day. The 5 ninety-year-olds may be nuns who are helping thousands of 5-year-olds to have better happier lives thus prolonging their lives. They may be passing their knowledge onto others so that it continues a legacy. They may be grandmothers of 5-year-olds who play an integral part in a larger family unit that thrives with them there.</p><p></p><p>Your only looking at the quantity and not quality and that's a bad way to determine things. As we will never know all these factors it is a risky general rule to apply. I think killing a number of people as opposed to 1 is wrong and has more risks associated. It is wrong to multiply the act of killing regardless of age. If the crime of murder more wrong whether its an old person or a young person morally. Do they get a longer sentence under the law?</p><p></p><p> We don't know as the trolley problem doesn't allow for any consideration of these factory brakes. You've got to assume that somehow the brakes were not working or there were no brakes. Otherwise, if there were brakes we could perhaps look at someone else not ensuring the brakes were working properly or sabotage from a 3rd party.</p><p></p><p>That's why it's unreal. It takes away human agency and disallows what any normal human would do which is "Do absolutely everything within their power to avoid killing anyone. When someone is able to do that it diminishes their moral accountability as they at least tried.</p><p></p><p> But unlike the trolley problem, it changes the fact that he knows his decision it will not definitely result in the death of someone. Therefore diminished responsibility IE accidental harm or death and not an intentional killing. For morality that's a big difference as it is about intentions.</p></blockquote><p></p>
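The quoted article's point that "the software may be required to decide between several courses of action" can be made concrete. Below is a minimal, hypothetical Python sketch of the naive utilitarian rule such software is often imagined to implement; the Trajectory type, the casualty estimates, and choose_trajectory are all illustrative assumptions, not any real autonomous-vehicle system's API.

[CODE=python]
# Hypothetical sketch of the "trolley-style" choice an autonomous
# vehicle's software might face. All names and numbers here are
# illustrative assumptions, not a real AV system's interface.

from dataclasses import dataclass

@dataclass
class Trajectory:
    label: str
    expected_casualties: float  # an estimate, never known with certainty

def choose_trajectory(options: list[Trajectory]) -> Trajectory:
    """Naive utilitarian rule: pick the path with the fewest
    expected casualties."""
    return min(options, key=lambda t: t.expected_casualties)

if __name__ == "__main__":
    stay = Trajectory("stay on course", expected_casualties=5.0)
    swerve = Trajectory("swerve", expected_casualties=1.0)
    print(choose_trajectory([stay, swerve]).label)  # prints "swerve"
[/CODE]

Note how the sketch reduces each course of action to a single number; the post's objection is precisely that human judgment in a crisis (intent, warning others, uncertainty about outcomes) is not captured by a rule of this kind.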
[QUOTE="stevevw, post: 74977587, member: 342064"] [COLOR=rgb(0, 0, 0)] No it's not. The Trolley problem specifically says you see a trolley coming along a track and it's going to run down 5 people. Do you change the track so that it misses those 5 people and hits 1 person instead? So not changing the track you are saying you will allow the trolley to hit the 5 people. That is the idea of the Trolley thought experiment to see if someone will take action to avoid something worse happening. Therefore in the Trolly problem there are only 2 choices. What I am saying is that as you and I have indicated that choosing to avoid mass murder and more misery, loss is the right option. That's because it is always the right option. Any other option is objectively wrong regardless of what a person's subjective opinion is. First the trolley problem does not specify that. You are adding further criteria to the equation. Then you open the door to add other stuff when you were insistent that I could not change anything in the trolley problem like there were brakes on the trolley or I could beep the trolley horn to warn the people. Otherwise, I could say what if the 5-year old was had a severe disability or was a diagnosed psycho maniac. But the other factor about why it is right to kill 1 rather than 5 is that you are multiplying the killing and doing mass murder (more of a bad thing) is always seen as a greater wrong regardless of the circumstances of the age and background. [/COLOR] I think we have misunderstood each other. I was stating that two people who both support objective morality IE two Christians will come to the same objective moral position. I wasn't trying to prove objective morality but to show how people who support objective morality will have the same morals. Actually I think you are not understanding the true implications of subjective views when it comes to morality. If subjective morality is similar to a persons "likes and dislikes" for food for example (choc icecream v vanilla) then how is there a distinction. Applied to morality it doesn't matter if one chooses choc as opposed to vanilla (moral view A or B) as "likes and dislikes" don't say anything about what is correct or right. You cannot say that your moral option (personal like) is more correct than someone else no more than you can say you're like for Vanilla icecream is correct as opposed to someone else's like for choc. So whether applied to like or dislikes or morality the subjective choices are equal in the overall scheme of things IE ultimately. Why is it scary. Therefore it is not the same as the Trolley Problem. The Trolley problem only allows for the person operating the trolley to make the decision to switch tracks not some detached person who cannot see what is happening. Yes, the same person who designed the brakes which will stop the car and avoid knocking down the pedestrian faster than any human reaction. But if you read the article I linked it states that machines are not human and cannot think like humans as far as how other drivers and pedestrians act no matter how you program them. That is what the experts are saying is why it is a wrong comparison to the Trolley problem because the Trolley problem is about the person at the scene (a human) seeing what is happening and changing the tracks accordingly. 
IE [I][COLOR=#00b3b3]The Trolley Dilemma has also been applied to autonomous vehicles, since in the face of a potential accident, the software may be required to decide between several courses of action.[/COLOR][B] [COLOR=#b300b3]That’s despite the fact that it’s considered an extremely flawed way to think about a complicated problem by prominent ethicists and researchers. [/COLOR][/B][/I] Humans are not robots and can anticipate unpredictable situations, do other people's thinking to anticipate something happening. A robot (automated car) can never be programmed to do that, therefore, using it in a trolley thought experiment is as they say a terrible model for the Trolley problem. Now you're adding new additions to the Trolley problem which you said we cant do. You said the people were workers on the Trolley lines. If they are just workers then they would be more aware of looking for something coming. After all, they are working on a trolley line where Trolley comes along. Though I gave the objective moral position for it my objection was about having to consider such an unreal situation in the first place. Because if all options were available then the objectively right thing to do would be what most people would do and that is to beep the horn and yell to the workers or yell and beep to get someone else to warn and get the workers off the track and avoid killing anyone at all. But you have made the trolley problem that way and taken away those real-life options. What the ethical experts were complaining about was how unreal the situation was with all the examples. You are denying human agency which is capable of extraordinary things when trying to save lives in a crisis situation and ironically are making humans robots. But then that would not be the Trolley problem as a small amount of damage is different from killing someone. It is different because the Trolley problem only has tow options kill 5 or kill 1 in hitting a switch. A car heading towards a pedestrian who they may kill and swerves away from that into an unknown there is no definite killing for that option thus the moral dilemma is different. Most people would probably hit the brakes but this is not allowed or you saying the person must die anyway. So then most people will swerve away thinking that they will avoid killing the pedestrian. But unlike the Trolley problem, they will not know or think they are definitely going to kill someone else as a result. They cannot be held responsible because they are not driving the car. That may be a lawsuit but for morality it is different. The programmer has not intended to kill anyone, they are not actively involved in killing anyone at the scene. They have programmed the car to do everything they can do to act like a human if not better. At worst they may be indirectly culpable but moral wise they have not committed any intentional act. But applying all this to your situation where a person darts out from between parked cars illegally then the fault will mostly lie with the pedestrian. Your assuming a lot. The 5-year-old may be a cripple who is suffering every day of their life. They may drown in an accident the next day. The 5 ninety-year-olds may be nuns who are helping thousands of 5-year-olds to have better happier lives thus prolonging their lives. They may be passing their knowledge onto others so that it continues a legacy. They may be grandmothers of 5-year-olds who play an integral part in a larger family unit that thrives with them there. 
Your only looking at the quantity and not quality and that's a bad way to determine things. As we will never know all these factors it is a risky general rule to apply. I think killing a number of people as opposed to 1 is wrong and has more risks associated. It is wrong to multiply the act of killing regardless of age. If the crime of murder more wrong whether its an old person or a young person morally. Do they get a longer sentence under the law? We don't know as the trolley problem doesn't allow for any consideration of these factory brakes. You've got to assume that somehow the brakes were not working or there were no brakes. Otherwise, if there were brakes we could perhaps look at someone else not ensuring the brakes were working properly or sabotage from a 3rd party. That's why it's unreal. It takes away human agency and disallows what any normal human would do which is "Do absolutely everything within their power to avoid killing anyone. When someone is able to do that it diminishes their moral accountability as they at least tried. But unlike the trolley problem, it changes the fact that he knows his decision it will not definitely result in the death of someone. Therefore diminished responsibility IE accidental harm or death and not an intentional killing. For morality that's a big difference as it is about intentions. [/QUOTE]
Insert quotes…
Verification
Post reply
Forums
Discussion and Debate
Discussion and Debate
Ethics & Morality
Where does morality come from?
Top
Bottom