Data Net Blog

Data Net has been serving the California area since 1983, providing IT Support such as technical helpdesk support, computer support, and consulting to small and medium-sized businesses.

Moore’s Law Is Failing. What’s Next?

Moore’s Law may have been prophetic for its time, but it was bound to run out of steam eventually. In 1965, Gordon Moore observed that the number of transistors on a dense integrated circuit was doubling roughly every year, a pace he later revised to a doubling about every two years. At the time, it seemed like an ambitious prediction at best. All these years later, however, chip density has kept remarkably close to that pace, but technology may have finally caught up with the prediction. How will technology’s growth change moving forward?

A Background on Moore’s Law

A few years after making his prediction, Moore went on to co-found Intel, a company that has been central to the production of semiconductors for more than 50 years. Intel’s first microprocessor had 2,300 transistors, yet today’s microprocessors contain billions. It turns out the prediction was right on the money.
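The jump from 2,300 transistors to billions is exactly what exponential doubling produces. As a rough illustration (assuming a steady two-year doubling period, which real chips only approximate), we can project the count forward from Intel’s first microprocessor:

```python
# Toy projection of Moore's Law, assuming one doubling every two years
# starting from the Intel 4004's 2,300 transistors in 1971. This is an
# illustration of exponential growth, not a model of actual product lines.
def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count, doubling once per `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{projected_transistors(2021):,.0f}")  # on the order of tens of billions
```

Fifty years of doubling every two years multiplies the starting count by 2^25, which is why the projection lands in the tens of billions, roughly where today’s largest chips actually sit.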

Why Is Moore’s Law Approaching Obsolescence?

There is only so much computation you can pack into a microprocessor, and one hard limit comes from physics itself: the speed of light. Computing is essentially electrons moving through matter, and no signal can travel faster than light, so the speed at which bits can flow through a chip is fundamentally capped. It is impossible to build computation that moves faster than the physical universe allows, no matter how hard we try. Physicist James R. Powell has predicted that Moore’s Law will run into these fundamental limits and become obsolete by around 2036.
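The speed-of-light cap is easy to make concrete with a back-of-envelope calculation (the 3 GHz clock rate here is just an illustrative figure): how far can any signal possibly travel during a single clock cycle?

```python
# Back-of-envelope sketch of the speed-of-light limit: the farthest any
# signal can travel in one clock cycle of a chip running at `clock_hz`.
C = 299_792_458  # speed of light in a vacuum, meters per second

def max_signal_distance_m(clock_hz):
    """Upper bound on signal travel distance during one clock cycle."""
    return C / clock_hz

print(f"{max_signal_distance_m(3e9) * 100:.1f} cm")  # roughly 10 cm per cycle
```

At 3 GHz, a signal can cross at most about 10 cm per cycle, and real electrical signals in silicon are slower still, which is why chips must keep shrinking to keep getting faster.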

When you consider other limitations, such as heat dissipation and the cooling systems chips require, the costs associated with fabricating ever-faster chips will eventually put an end to this consistent growth in computing power.

What Happens Then?

Chances are that engineers will keep pushing against this limit, and when they find that they cannot exceed what is physically possible with microprocessors, we’ll start to see the emergence of quantum computing. Quantum computers use qubits, or quantum bits, which exploit superposition and entanglement. Because qubits sidestep the miniaturization problems of traditional computing, quantum computers could solve certain problems in minutes that would take even a modern 5nm microprocessor decades.
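Superposition and entanglement can be sketched with a tiny classical simulation of ideal qubits (a toy state-vector model, not a description of real quantum hardware or of any product mentioned here):

```python
import numpy as np

# Toy state-vector sketch of the two ideas named above. A qubit is a
# 2-element vector; squaring its amplitudes gives measurement probabilities.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1.0, 0.0])                    # qubit starting in state |0>

# Superposition: Hadamard puts |0> into an equal mix of |0> and |1>.
superposed = H @ ket0
print(np.round(superposed ** 2, 2))            # probabilities [0.5 0.5]

# Entanglement: CNOT after Hadamard yields the Bell state (|00> + |11>)/sqrt(2),
# so measuring the two qubits always gives perfectly correlated results.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(superposed, ket0)
print(np.round(bell ** 2, 2))                  # probabilities [0.5 0. 0. 0.5]
```

The point of the sketch: one qubit holds a blend of both classical states at once, and two entangled qubits behave as a single correlated system, which is where quantum computing’s speedups for certain problems come from.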

With certain technologies like artificial intelligence really taking off, it’s no surprise that innovation in computers and microprocessors will continue, but it will certainly look a bit different in just a few short decades. What do you think about these changes in computing? Are you excited for what the future holds? Be sure to let us know in the comments below.


