Quote:
Originally Posted by Oakback
I believe I watched a documentary ( if memory serves me, Elon Musk had some input ) where a computer generated "person", that was controlled by AI, would have sophisticated conversations with different participants. The experiment was to see how many people were aware they were talking to a robot. Only about half realized the "person" on the screen was computer generated.
The image they were talking to was scary real.
Yes, there is a web site you can go to and talk with a virtual computer-generated "friend". You can build out the facial features, hair style, and so forth, decide if it's male or female, and probably by now play with aspects of its personality. It learns about you through conversation and, I understand, does a pretty good job of pulling the whole thing off. I believe it's one of those free things with an optional paid premium version.
Musk himself is about to show a prototype of a humanoid robot. His initial motivation is to deal with a so-called "labor shortage" in his factories. Like most such "shortages" it is a function of paying employees too little and treating them too poorly. So he's explicitly building slave laborers -- doubtless, the fantasy of captains of industry everywhere. In this particular case I think he's bitten off way more than he can chew. Just getting a non-specialized humanoid robot to properly hold and use a wrench is a very difficult problem with current technology.
He's not successfully pulling off his long-promised autonomous / "fully self-driving" cars, either. He's found enough dim bulbs to pay him $10K (now $15K, I believe) extra for the half-baked version available now, on the gamble that the "full" version will be working Any Day Now. No matter how hard they try, the software is still confused by novel situations. As a result, the cars can handle freeways and most highways pretty well, but they still demand you pay attention to the road and be ready to take over driving at any moment, and many cityscapes are too much for them -- so what's the point?
The inside joke is that when one of these systems isn't confused by seeing an ostrich in the road, we'll know there's actual reasoning going on, instead of just really expensive pattern-matching. Until then I'm not interested in trusting my road safety to a computer, thankyouverymuch!