Old 08-09-2014, 11:55 AM
 
12,918 posts, read 16,900,789 times
Reputation: 5434


This is one of the strangest ideas I have heard atheists state: that human consciousness is no different from something that could evolve into being inside a man-made machine or computer.

Do some really believe that a machine can have conscious feelings?

 
Old 08-09-2014, 11:57 AM
 
18,553 posts, read 15,639,349 times
Reputation: 16250
Quote:
Originally Posted by OzzyRules
This is one of the strangest ideas I have heard atheists state: that human consciousness is no different from something that could evolve into being inside a man-made machine or computer.

Do some really believe that a machine can have conscious feelings?
In principle, yes. That is not to say that any such technology will be around in the near future, though.
 
Old 08-09-2014, 12:05 PM
 
Location: S. Wales.
50,089 posts, read 20,814,520 times
Reputation: 5931
It seems hard to imagine, but the more complex and sophisticated a machine mentality becomes, the more likely it is to have what we think of as feelings and emotions.
 
Old 08-09-2014, 12:46 PM
 
Location: Northeastern US
20,104 posts, read 13,564,519 times
Reputation: 9995
Quote:
Originally Posted by OzzyRules
This is one of the strangest ideas I have heard atheists state: that human consciousness is no different from something that could evolve into being inside a man-made machine or computer.

Do some really believe that a machine can have conscious feelings?
As a software developer with 30 years in the business and a hobbyist interest in AI, I believe this will eventually happen. Self-awareness is simply a feedback loop, something we have a lot of experience with and that already works well in trial-and-error "self learning" in applications such as voice recognition and expert systems. Recently an open source library that simulates the human prefrontal cortex for purposes of recognizing patterns in data streams has been released. We are learning that the human brain is not at all like a digital computer; it is a pattern-matching engine -- more specifically, one optimized to notice things that don't fit observed patterns, which is, of course, generally speaking, a survival benefit.
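
To give a concrete (and heavily simplified) picture of what I mean by a feedback loop and a pattern-matching engine, here is a toy Python sketch. It is purely illustrative -- not any particular library, and the function names, thresholds, and numbers are all made up:

Code:
import random

# Toy "trial and error" feedback loop: keep whatever random adjustment
# reduces the error, discard whatever increases it. Purely illustrative.
def learn_setpoint(target, steps=500, step_size=1.0):
    estimate = 0.0
    for _ in range(steps):
        candidate = estimate + random.uniform(-step_size, step_size)
        # Feedback: accept the candidate only if it lands closer to the target.
        if abs(target - candidate) < abs(target - estimate):
            estimate = candidate
    return estimate

# Toy "pattern matching": flag an observation that falls well outside
# the spread of everything seen before.
def looks_anomalous(history, observation, tolerance=3.0):
    mean = sum(history) / len(history)
    spread = (sum((x - mean) ** 2 for x in history) / len(history)) ** 0.5
    return abs(observation - mean) > tolerance * max(spread, 1e-9)

print(round(learn_setpoint(10.0), 2))           # settles near 10
print(looks_anomalous([9, 10, 11, 10, 9], 25))  # True -- it doesn't fit the pattern

The first function "learns" only by keeping whichever random guess reduces its error -- feedback, nothing more -- and the second flags an observation that doesn't fit the pattern it has already seen. Real systems are vastly more sophisticated, but that is the basic shape.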

When I say it will "eventually" happen, that does not mean I think it is "nigh, even at the door". It could be the better part of a century before an AI develops even rudimentary self-awareness. It is a tremendously complex problem. However, when it does happen I suspect it will burst on the scene rather suddenly, as it will simply be the intersection of a sufficient understanding of how natural consciousness and self-awareness work, sufficiently powerful computing hardware with the proper architecture, and properly crafted software. It isn't inconceivable that a near-future scenario like the one depicted in the recent movie Her could happen. I just doubt it, for a variety of reasons -- not the least of which is that if we have to fiddle so much with our computers just to use them for conventional purposes, I don't see how they would be very good vehicles for an AI at this point.

It is hardly an "atheist" idea, although I myself am an atheist and I suppose that we are less biased against the idea and less suspicious of mechanistic / materialistic explanations for even our more refined and esoteric perceptions.

I will say this, even as an atheist, though. It is entirely possible that our AI research will end up demonstrating that there is some totally unreproducible aspect of consciousness and self awareness, that it is neither at bottom grounded in the material world nor an emergent property thereof. I don't think this is likely but I don't deny it is a possibility. If I'm proven wrong I will happily embrace such information so long as it's not just something made up as a placeholder for ignorance. Because I also believe that not all ignorance can be remedied. We are finite beings with finite capabilities and some things may be ever beyond us. That is proof of only one thing for sure: that some things are outside of our personal scope.
 
Old 08-09-2014, 01:02 PM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,199,387 times
Reputation: 21239
Consciousness could theoretically be reached, as in the machine becoming aware of itself and self-interested. It wouldn't be "human" consciousness unless we built some machine with simulated hormones which inflame passions and subject the machine to human dynamics like jealousy, greed, compassion, desire for recreation, aspirations for a better life, etc.

Even if we had the capacity to build such a machine, would it be a good idea? Would we want an artificial human with its inherent unpredictability, or would we want a Star Trek Next Generation style AI being such as Mr. Data?
 
Old 08-09-2014, 01:44 PM
 
Location: Northeastern US
20,104 posts, read 13,564,519 times
Reputation: 9995
Quote:
Originally Posted by Grandstander
Consciousness could theoretically be reached, as in the machine becoming aware of itself and self-interested. It wouldn't be "human" consciousness unless we built some machine with simulated hormones which inflame passions and subject the machine to human dynamics like jealousy, greed, compassion, desire for recreation, aspirations for a better life, etc.

Even if we had the capacity to build such a machine, would it be a good idea? Would we want an artificial human with its inherent unpredictability, or would we want a Star Trek Next Generation style AI being such as Mr. Data?
It is an interesting question whether emotions are just inherent in consciousness or are some sort of bolt-on like Mr. Data's "emotion chip".

My feeling is that emotions are just feedback mechanisms in action -- and when those are in some sort of general homeostasis we call that "well adjusted" or "sane". If an organism has a survival need, it will be expressed as anxiety and fear when danger presents itself, and as pleasure and contentment when there are no immediate threats. We tend to experience these almost like complex "things" in their own right, but I suspect they are nothing more than biochemical reactions to various stimuli. So Mr. Data's emotions would likely not come from an add-on chip, but would be a subjective, self-aware response to perceived threats / non-threats ... if he had a relationship between awareness and some kind of sympathetic / parasympathetic nervous system like ours, emotions would be what he would call the subjective experience of those things interacting.
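
To make that a bit less hand-wavy, here is the sort of toy loop I have in mind -- a made-up Python sketch of "emotion as homeostasis", with invented numbers and labels, not a model of any real nervous system:

Code:
# Toy "emotion as feedback": one internal variable is pushed up by
# perceived threat and constantly pulled back toward a homeostatic
# baseline. The emotion words are just labels for the loop's state.
def step(arousal, threat, baseline=0.2, decay=0.5, gain=1.0):
    # decay term = pull back toward baseline; gain term = push from threat
    return arousal + decay * (baseline - arousal) + gain * threat

def label(arousal):
    if arousal > 0.8:
        return "fear"
    if arousal > 0.4:
        return "anxiety"
    return "contentment"

arousal = 0.2
for threat in [0.0, 0.0, 0.9, 0.9, 0.0, 0.0, 0.0]:
    arousal = step(arousal, threat)
    print(f"threat={threat:.1f}  arousal={arousal:.2f}  -> {label(arousal)}")

Run it and the variable spikes while the "threat" is present, then decays back toward its baseline once the threat is gone; the emotion words are nothing more than labels for where the loop happens to be at the moment.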

To the point of your question, would this be a good idea ... at this point, who knows? It would open a whole new field of moral investigation. If I could create a true AI in the 8 gigabytes of memory on my MacBook Air right now, the moment it became self-aware, would it be immoral to shut that simulation down? Would it beg me not to shut it down -- or would it perhaps beg me TO shut it down?

Furthermore, early prototypes may not get the balance of factors correct, and may be subject to various forms of instability, madness, and personality disorders -- probably, initially, to a greater degree than humans are with our imperfect, fragile, overheated brains. Is it immoral to bring such experimental subjects into existence, so that they can suffer, unasked for? Or should we get their permission first?
 
Old 08-09-2014, 01:49 PM
 
888 posts, read 455,557 times
Reputation: 468
If these were wifi-enabled hormones that gave different robots the ability to feed off each other's dynamics, that could be a real mess, especially when one thinks about the different ways the individual hormone levels, not to mention cyclical variations, could be programmed. Talk about getting into someone's head... It would also be a way to really blur gender differences.
 
Old 08-09-2014, 01:59 PM
 
Location: Northeastern US
20,104 posts, read 13,564,519 times
Reputation: 9995
Of course an "infant" AI would probably need some amount of time to explore, experiment, and acquire the ability to relate, just like a human infant, and would probably not develop properly without some form of "parenting". And/or it might be quite alien in its needs compared to us, and we'd be totally inadequate "parents".

It will be an interesting ride, if it happens. I think it will, in some form.
 
Old 08-09-2014, 02:12 PM
 
888 posts, read 455,557 times
Reputation: 468
Quote:
Originally Posted by mordant
It is an interesting question whether emotions are just inherent in consciousness or are some sort of bolt-on like Mr. Data's "emotion chip".

My feeling is that emotions are just feedback mechanisms in action -- and when those are in some sort of general homeostasis we call that "well adjusted" or "sane". If an organism has a survival need, it will be expressed as anxiety and fear when danger presents itself, and as pleasure and contentment when there are no immediate threats. We tend to experience these almost like complex "things" in their own right, but I suspect they are nothing more than biochemical reactions to various stimuli. So Mr. Data's emotions would likely not come from an add-on chip, but would be a subjective, self-aware response to perceived threats / non-threats ... if he had a relationship between awareness and some kind of sympathetic / parasympathetic nervous system like ours, emotions would be what he would call the subjective experience of those things interacting.

To the point of your question, would this be a good idea ... at this point, who knows? It would open a whole new field of moral investigation. If I could create a true AI in the 8 gigabytes of memory on my MacBook Air right now, the moment it became self-aware, would it be immoral to shut that simulation down? Would it beg me not to shut it down -- or would it perhaps beg me TO shut it down?

Furthermore, early prototypes may not get the balance of factors correct, and may be subject to various forms of instability, madness, and personality disorders -- probably, initially, to a greater degree than humans are with our imperfect, fragile, overheated brains. Is it immoral to bring such experimental subjects into existence, so that they can suffer, unasked for? Or should we get their permission first?
I read some of your posts about AI in another recent thread, along with this thread, and find them very interesting. Since I don't know much about AI, I appreciate your sharing your expertise.

The area of intelligence I speculate most on these days in my own private musings is music and how people emotionally feed off each other through music. Based on my own personal experience, not anything I've read, I think there are differences in the cognitive processes used when a group of people experience live music together (especially singing) compared to an individual and/or group experience with recorded music.

Singing with a group triggers something in me unlike anything else. I wonder if there is something cognitive going on when people sing together that triggers something in the brain, be it emotional, hormonal, stress reduction, etc., or various combinations thereof, that goes beyond what is possible for the brain to experience when one is alone or through recorded music. I liken it to musical pheromones and wonder if future research will discover pheromones that are activated by music.
 
Old 08-09-2014, 05:23 PM
 
Location: Parts Unknown, Northern California
48,564 posts, read 24,199,387 times
Reputation: 21239
Quote:
Originally Posted by mordant

Furthermore, early prototypes may not get the balance of factors correct, and may be subject to various forms of instability, madness, and personality disorders -- probably, initially, to a greater degree than humans are with our imperfect, fragile, overheated brains
That was one of the premises of "2010: The Year We Make Contact"...the sequel to "2001: A Space Odyssey." The crew finds and boards the abandoned Jupiter mission ship and discovers what went wrong with HAL -- what turned it into a killer.

The explanation was that HAL had the equivalent of a paranoid nervous breakdown because of a conflict in its orders. The NSC had made HAL aware of the discovery of the monolith on the moon, so HAL knew the true purpose of the mission; the astronauts had not been informed. HAL had been programmed to always deliver honest answers to humans, and was unable to reconcile that with the instruction to conceal its knowledge from the crew members. The inability to resolve this conflict caused HAL's mental breakdown.