As this thread continues I'd like to remind everybody to stay calm and respectful. Remember this is a Science forum on a real estate website; it's not a place for scientists to peer-review their colleagues' theses. Discussions like this have a place here and will not be moved. Even if they're closer to science fiction than to actual science, they're still interesting and provide us, the readers, with a ton of interesting links and theories.
Yac.
While I didn't read it, I have no doubt that I would agree with most everything written in the article. However, it was a special report. Special reports are not peer reviewed, but they generally represent a consensus of current thought.
I will read it, but I doubt that Kurzweil included some of his more, shall we say, controversial ideas in the submission.
I will read it, as well.
By the way… what could make my mom less afraid of virtual reality, transhumanism, and technology? I don't want her to think the gates of hell have opened…
I will say that the biggest hurdle, which will determine whether the singularity becomes science fact (at least in our lifetime) or remains science fiction, will come around 2020-2022, when the current paradigm, the integrated circuit, runs out of steam. No one disagrees that this will happen. The disagreement is over what happens next.
There are two camps:
Camp A
Thinks we will have a smooth transition to the next paradigm: 3D self-organizing molecular structures. They cite the history of computers, showing that we have gone through several paradigms since the first modern computer was built in 1890, and that every transition was smooth: computers continued to advance exponentially without missing a beat. They point out that, if anything, the rate of that exponential advance has itself increased, with the doubling time falling from about three years in 1900 to two years in 1960 to roughly 11 months now. They say that trend will continue.
Camp B
Thinks that when the current paradigm ends, computers will no longer advance exponentially. That will cause singularity proponents to push the date back farther and farther, essentially putting it out of reach.
Now, I am in camp A. Not because I want the singularity to happen and therefore hope computers keep advancing exponentially, but because I've looked at the evidence on both sides and see no reason to think that, after thousands of years of humans becoming smarter and computers advancing exponentially since the first one was built in 1890, it will suddenly stop now. I think science will make a smooth transition to the next paradigm, no one will notice a difference, computers will continue to advance exponentially, and that rate will itself keep speeding up, finally reaching a point where change occurs so fast that unless we merge with the computers we will not be able to keep up. That is the singularity, and I predict it will arrive sometime between 2030 and 2045: 2030 being the date most people will say it started, while 2045 is the date engineers like Ray Kurzweil, who tie the definition to specific processing capabilities of computers, say it will happen. In a way, both are right.
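Just to make the arithmetic behind camp A's argument concrete, here is a minimal sketch of what a shrinking doubling time implies for cumulative growth. The era boundaries and doubling times below are simply the figures quoted in this thread (three years around 1900, two years around 1960, about 11 months now), not measured data, and the 2045 endpoint is just the date discussed above.

```python
# Illustrative sketch, not a forecast: total growth from compounding with a
# doubling time that shrinks from era to era (figures quoted in the thread).

eras = [
    # (start year, end year, doubling time in years)
    (1900, 1960, 3.0),     # doubling every ~3 years
    (1960, 2013, 2.0),     # doubling every ~2 years
    (2013, 2045, 11 / 12), # doubling every ~11 months
]

def growth_factor(eras):
    """Return (total multiplicative growth, number of doublings) across eras."""
    total_doublings = sum((end - start) / d for start, end, d in eras)
    return 2 ** total_doublings, total_doublings

factor, doublings = growth_factor(eras)
print(f"about {doublings:.1f} doublings, a growth factor of roughly 2^{doublings:.0f}")
```

The point of the sketch is only that the conclusion is very sensitive to the doubling-time assumption: if the last era's doubling time stretches back to two or three years (camp B's worry), the number of doublings by 2045 drops sharply.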
Last edited by Josseppie; 06-24-2013 at 08:58 AM..
One of the things I try to stress is that the road to the singularity will be paved with many small steps. You can see that in the history of computers: we did not go from the mainframe computers of the 1970s, which took up an entire room, to the iPhone in one step. It took many steps, so by the time we got the iPhone it seemed normal. The same will happen as computers get smaller and smaller and we merge with them: it will take many steps, so by the time we fully merge with them it will seem normal and no big deal. The big difference is that the steps will now happen faster and faster as we approach the singularity.
This is the latest example of one of those steps, from the New York Times:
As society struggles with the privacy implications of wearable computers like Google Glass, scientists, researchers and some start-ups are already preparing the next, even more intrusive wave of computing: ingestible computers and minuscule sensors stuffed inside pills.
Although these tiny devices are not yet mainstream, some people on the cutting edge are already swallowing them to monitor a range of health data and wirelessly share this information with a doctor. And there are prototypes of tiny, ingestible devices that can do things like automatically open car doors or fill in passwords.
I happen to be in camp B. I couldn't find any info on 3D self-organizing molecular structures as they pertain to computing. I found some scientific articles, but they seem to discuss the medical and biological aspects, not computing.
Am I simply not typing the right search terms into Google?
I remember a few threads back another city data member had heard of it and posted the following:
Crystalblue, because last I heard, 3D still runs too hot, and they have not overcome that yet.
And even if/when they do, how many layers can they stack? Too many won't work for laptops/tablets (and I am not ready yet to put all my stuff in the cloud).
Quantum has its own issues; I don't think it is supposed to go in the direction of AI.
He mentioned Quantum computing there as well because I had mentioned that we were getting close to achieving quantum computing.
That seems to be our biggest disagreement.
Look at this article from MIT News. I have posted it before, but maybe you did not see it.
Self-assembling computer chips
Molecules that arrange themselves into predictable patterns on silicon chips could lead to microprocessors with much smaller circuit elements.
The features on computer chips are getting so small that soon the process used to make them, which has hardly changed in the last 50 years, won’t work anymore. One of the alternatives that academic researchers have been exploring is to create tiny circuits using molecules that automatically arrange themselves into useful patterns. In a paper that appeared Monday in Nature Nanotechnology, MIT researchers have taken an important step toward making that approach practical.
Interesting article, but as usual you're way too optimistic. Did you read this portion of the article?
Much more research is required, however, before self-assembling molecules can provide a viable means for manufacturing individual chips. Nearer term, Berggren and Ross see the technique’s being used to produce stamps that could impart nanoscale magnetic patterns to the surfaces of hard disks, or even to produce the masks used in conventional lithography: today, state-of-the art masks for a single chip require electron-beam lithography and can cost millions of dollars. In the meantime, Ross and Berggren are working to find arrangements of their nanoscale posts that will produce functioning circuits in prototype chips, and they’re trying to refine their technique to produce even smaller chip features.
I did, but keep in mind this article was written in 2010, a full 10 to 12 years before the current paradigm runs out of steam. If you go back a decade before the integrated circuit arrived, not much was known about it either. The first year it was commercially viable was 1961, so you'd have to go back to 1950, and it was not until the late 1950s that much research actually started on it. So the fact that they are making this kind of progress already is a good sign.
True, but I feel like we're dealing with far more complex ideas; from the article I got the sense that we're still a long way off.
I agree the problem is more complex, but the computers available to solve it are more advanced as well. All in all, we are in the same position we were in around 1953: one paradigm was about to end, so in order to keep computers advancing exponentially, the field needed to move to the next one. I suspect they had their skeptics then as well. If anything, we seem to be ahead of the curve, because I found nothing that talks about the integrated circuit in 1950, only from 1957 on. I have heard Ray talk about this, and he thinks the next paradigm will be ready by the second half of this decade; if researchers are doing this much now, I think he could be right again.