CPU speed has flatlined since 2003. We've been through this already; do we really need to have this argument all over again?
Software engineer here: who cares about CPU clock speed?
Sure, it would probably be a bit simpler to program if the clock speed doubled every year instead of overall CPU power, but hey, software is advancing nicely along the massive-parallelisation route, and our algorithms are effectively doubling in processing power alongside the CPU bang-for-the-buck doublings.
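To make the parallelisation point concrete, here's a minimal Python sketch (my own illustration, not anything from the thread): a deliberately CPU-bound prime count split into independent chunks so extra cores actually help.

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def parallel_prime_count(limit, workers=4):
    """Split [0, limit) into one chunk per worker and count primes in parallel."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # absorb any remainder in the last chunk
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    # Same answer as a serial loop; wall-clock time scales down with cores,
    # but only because the chunks are fully independent.
    print(parallel_prime_count(100_000))
```

The catch, of course, is that this only works when the problem splits cleanly; that caveat is exactly what the Amdahl's law discussion further down is about.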
No, but you mentioned Moore's Law, which pertains to hardware, not software.
Moore's law is continuing, and when the integrated circuit comes to an end in the early 2020s we will just move to the next paradigm: the 3D self-organizing molecular circuit.
The proof is in the change computers have seen even in the past 10 years: they are much smaller and faster today than they were in 2004. Exponentially so. In 2004 we did not have a smartphone, as the iPhone came out a few years later, and today we have Google Glass. That is a huge leap in just 10 years. And as the numbers get bigger, the exponential rate gets faster, so we will see even more changes in the next 10 years. That is why we will start to merge with computers by the mid-2020s, as they get down to the size of blood cells.
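For what "exponential" actually implies numerically, here's a quick back-of-the-envelope sketch (assuming the roughly-two-year doubling period usually quoted for Moore's law; the period is my assumption, not a figure from this thread):

```python
def doublings(years, doubling_period_years=2.0):
    """How many times capacity doubles over the given span of years."""
    return years / doubling_period_years

def growth_factor(years, doubling_period_years=2.0):
    """Overall multiplier after the given span (e.g. transistor count)."""
    return 2 ** doublings(years, doubling_period_years)

# Ten years at a 2-year doubling period is 5 doublings: a 32x increase.
print(growth_factor(10))   # 32.0
# The same rule compounded over twenty years gives 1024x.
print(growth_factor(20))   # 1024.0
```

Note this says nothing about *clock speed*, which is the whole dispute in this thread: transistor counts can keep doubling while single-core speed stays flat.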
Last edited by Josseppie; 09-04-2014 at 09:48 AM..
I will let the others read Herb Sutter's article "The Free Lunch is Over" and see for themselves that this has been debunked already. Moore's law is dead, or at least, its relevance to computer *performance measures* is dead.
Moore's law is not dead.
Intel reveals 14nm PC, declares Moore's Law 'alive and well'
So why did you ignore my question on hacking? Now they're reporting that healthcare.gov was hacked, you don't think this will become an issue with implants and robots?
That is because an IT guy would be better able to answer this question than I am. I mean, sure, I guess that might happen, or maybe we will do a better job of protecting them; I am not sure exactly how it will work with robots. Knowing we will have the technology and being able to say exactly how it will work are two different things.
One of the things that I post a lot about is how, by the mid-2020s, computers will be the size of blood cells and will become so powerful that by 2030 they will be thousands of times more powerful than the smartphone of today. People have asked how that will work. Well, today they are already making progress on it, and when I saw this story I had to post it. Again, this is today; with it advancing exponentially, imagine how this will be in 6-10 years!
This is from Science Daily:
Using only a few ingredients, a biophysicist and his team have successfully implemented a minimalistic model of the cell that can change its shape and move on its own. They describe how they turned this goal into reality in a new article.
Yeah, what about it? A "law" that says that if you can't parallelize, then parallelizing won't have a big effect... Couldn't have figured that one out by myself.
The article "The Free Lunch is Over" seems right (I only skimmed it), but only about CPU clock speed, not bang for the buck (CPU power per dollar). So it debunks nothing about Moore's law. Interesting in 2004, maybe.
Moore's law proper will end (as it is about transistors), but the "law" (call it what you want; most people use the term Moore's law) that computers get exponentially more powerful (and even the exponent is increasing) will continue for decades.
IBM, for example, is betting on graphene to replace transistors (LINK).
[edit] what was that hacking stuff about? I'm a computer guy so I might have some answers.