Technology keeps progressing as the years go by. Not long ago we were watching black-and-white films on Betamax and VHS tapes; now we can live-stream our own videos in full color using just our phones, single devices that handle jobs that once required a pile of separate gadgets. And there is no better example of these advancements in action than how computers have evolved over the years.
Computers of yesteryear were bulky and relied on floppy disk drives you will never see in a modern machine. Hardware is essentially the backbone of any computing device: what use is sophisticated software without a sturdy, functional drive to house and protect it? Hard drives can have their issues, yet computers still look roughly the same and contain largely the same parts, only smaller. Experts now wish for more innovation in computer hardware because, at the architectural level, not much has changed over the decades.
Dumping Moore’s Law is perhaps the best thing that could happen to computers, as it’ll hasten the move away from an aging computer architecture holding back hardware innovation.
That’s the view of prominent scientist R. Stanley Williams, a senior fellow at Hewlett Packard Labs. Williams played a key role in the creation of the memristor by HP in 2008.
Moore’s Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors on a chip will double every 18 to 24 months, while the cost of making chips goes down.
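To make that doubling concrete, here is a quick back-of-the-envelope sketch in Python. The starting point (the roughly 2,300 transistors of Intel’s 4004 from 1971) and the flat 24-month doubling period are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope illustration of Moore's Law-style doubling.
# Assumptions (not from the article): start from the Intel 4004's
# roughly 2,300 transistors in 1971 and double every 24 months.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Project the transistor count for a given year under ideal doubling."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return int(START_TRANSISTORS * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{projected_transistors(year):,} transistors")
```

Run over a few decades, the idealized curve climbs from thousands of transistors to billions, which is roughly the trajectory real chips have followed.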
Every year, computers and mobile devices that are significantly faster can be bought with the same amount of money, thanks in part to guidance from Moore’s Law. The observation has helped drive up device performance on a predictable basis while keeping costs down.
But the predictions tied to Moore’s Law are reaching their limits as it becomes harder to make chips at smaller geometries. That’s a challenge facing all the top chip makers, including Intel, which is changing the way it interprets Moore’s Law as it tries to cling to it for dear life.
Over the years, Moore’s Law has influenced the technological advancements around us, including the smartphones and other smart gadgets you hold today. However, now that chips have become so advanced, making things even smaller no longer guarantees that they will also be better and faster. Moore’s Law held that the power of a device doubles roughly every two years, but in a way that expectation has also restricted innovation. Today, experts in the field realize that for computing to progress, they will have to change the way things are done without sacrificing the progress already made.
The end of Moore’s Law will bring creativity to chip and computer design and help engineers and researchers think outside the box, Williams said. Moore’s Law has bottled up innovation in computer design, he hinted.
So what’s next? Williams predicted there would be computers with a series of chips and accelerators patched together, much like the early forms of superfast computers. Computing could also be memory driven, with a much faster bus driving speedier computing and throughput.
The idea of a memory-driven computer plays to the strengths of HPE, which has built The Machine along those lines. The initial version of The Machine has persistent memory that can be used as both DRAM and flash storage but could eventually be based on the memristor, an intelligent form of memory and storage that can track data patterns.
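The article doesn’t describe The Machine’s actual programming model, but the general idea of one pool of bytes acting as both working memory and durable storage can be loosely sketched with an ordinary memory-mapped file in Python. The file name and size below are invented for the illustration and have nothing to do with HPE’s hardware.

```python
# Loose conceptual sketch of "memory that is also storage": a memory-mapped
# file looks like ordinary RAM to the program, yet its bytes persist on disk.
# This is NOT HPE's API; the file name and size are invented for illustration.
import mmap

PATH = "persistent_pool.bin"   # hypothetical backing file
SIZE = 4096                    # one page, purely for demonstration

# Make sure the backing file exists and is the right size.
with open(PATH, "a+b") as f:
    f.truncate(SIZE)

# Map the file: reads and writes are plain byte accesses in memory,
# but the data survives after the process exits.
with open(PATH, "r+b") as f, mmap.mmap(f.fileno(), SIZE) as pool:
    pool[0:5] = b"hello"      # write "through memory"
    print(pool[0:5])          # read it back: b'hello'
```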
Memory-driven computing could also break down the current architecture-based and processor-centric domination of the computer market. In the longer term, neuromorphic chips designed around the way the brain works could drive computing.
(Via: http://www.computerworld.in/news/its-time-dump-moores-law-advance-computing)
We have watched computers get better, more powerful, and faster roughly every two years. But considering how small the transistors powering most modern computers have become, experts are now looking into other ways to keep progress in computing going. If things continue on their current course, some predict the arrival of the so-called “singularity,” the point at which machines become capable of near-human intelligence. While some are quite sure that such an event is looming on the horizon, skeptics counter that Moore’s Law is no longer the authority in computing it once was.
But all of these debates matter little to the average Joe. What people notice is that computers keep getting better at parallelization (working on many things at once, a little like the brain does) thanks to the multi-core processors on the market today, as the quick sketch below illustrates. Then there is quantum computing, which may be decades away from reaching the public but still looms on the horizon as a possibility. We can only wait and see what the future of computing holds for all of us.
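For anyone curious what that multi-core parallelization looks like in practice, here is a minimal sketch using Python’s standard multiprocessing module; the workload and the number of chunks are made up purely for the example.

```python
# Minimal sketch of spreading a toy workload across CPU cores using the
# standard-library multiprocessing module. The workload and the number of
# chunks are invented purely for this example.
from multiprocessing import Pool, cpu_count

def crunch(n: int) -> int:
    """A deliberately CPU-bound toy task: sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8                 # eight identical chunks of work
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, jobs)   # chunks run on separate cores
    print(f"Used {cpu_count()} cores, total = {sum(results)}")
```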