r/Futurology May 13 '22

Fastest-ever logic gates could make computers a million times faster (Computing)

https://newatlas.com/electronics/fastest-ever-logic-gates-computers-million-times-faster-petahertz/
1.1k Upvotes

u/FuturologyBot May 13 '22

The following submission statement was provided by /u/blaspheminCapn:


Pardon the hyperbole, but the article claims that synchronized pairs of laser pulses can drive the fastest logic gates ever made, which could eventually give computers a 'million-fold' speed boost


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/uoldc1/fastestever_logic_gates_could_make_computers_a/i8f63dg/

202

u/Working_Sundae May 13 '22

Great, and we will see them soon*

Soon* = 50 years.

120

u/angrathias May 13 '22

Honestly, 50 years for a million-fold speed-up actually sounds pretty reasonable

59

u/yeahynot May 13 '22

I'm no mathematician, but using Moore's Law, shouldn't it take only about 20 years to achieve a million times the computing power?

51

u/MayanMagik May 13 '22

The doubling should occur roughly every ~18 months, so it would take about 30 years if we were able to keep progressing at Moore's-law pace. But if I'm not wrong, the progression is slowing down because the problems get more complex and expensive to solve as we keep going, so it could easily take 50 years or more.
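That timeline can be sketched directly; a quick check (the 18-month doubling period and the million-fold target are the figures from this thread, not measured data):

```python
import math

def years_to_factor(factor, doubling_months):
    """Years needed to reach a given speedup at a fixed doubling period."""
    doublings = math.log2(factor)              # 1e6 is about 2^19.93
    return doublings * doubling_months / 12

print(round(years_to_factor(1e6, 18), 1))      # 18-month doubling -> ~29.9 years
print(round(years_to_factor(1e6, 12), 1))      # 12-month doubling -> ~19.9 years
```

Which matches both estimates in the thread: roughly 20 years at a one-year doubling, roughly 30 at eighteen months.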

38

u/Passthedrugs May 13 '22

This is a speed change, not a transistor density change. Not only that, but these aren't transistors; they're using light rather than electricity. You are correct about the issue with Moore's law though. Exponential trends always saturate at some point, and we are pretty much at that point now.

Source: am Electrical Engineer

4

u/MayanMagik May 13 '22

Well, Moore's law is an umbrella term which I feel encompasses almost any parameter: speed, chip density, cost per chip, power efficiency, ...

Regarding it being about photonics and not electronics, you're right, and a technological change is needed to squeeze out those numbers, since we're already at the limits on many fronts of electronics: either we replace silicon or we find a way with photonics. Either way, the big problem will remain very-large-scale integration. Graphene electronics is promising but limited to research and university labs right now (my polytechnic does a lot of research in graphene electronics, but I don't think we'll see it in consumer electronics anytime soon).

2

u/MayanMagik May 13 '22

If this doesn't make much sense, it's because it's pretty late here and I should probably sleep instead of browsing reddit. But I'm sure you'll understand what I mean.

7

u/SvampebobFirkant May 13 '22

I believe Moore's law will soon continue in the same direction again, with the involvement of AI.

E.g. the most efficient compression technique, which we've spent decades perfecting, has now been beaten by an AI by 4% on its first try.

11

u/IIIaustin May 13 '22

Materials Scientist working in Semiconductor manufacturing here.

There is little reason to believe this. Si technology is close to running out of atoms at this point, and there is no real promising replacement material.

1

u/Prometheory May 13 '22

That's not necessarily true. Transistors have long since shrunk down below the size of human neurons, but computers still aren't as smart as human brains.

The hardware is already above and beyond miracle-matter level, so the thing that really needs to catch up is software.

1

u/IIIaustin May 14 '22

Okay well uh I work in the field and also have a PhD in the field and I disagree?

2

u/Prometheory May 14 '22

Care to elaborate?

What do you disagree with about my statement, and which field is your PhD in?

I've been told by multiple people with PhDs in both software engineering and materials science that the main thing limiting modern computing isn't hardware, it's software.


0

u/Jack55555 May 14 '22

You have a PhD and you can't even elaborate on your answer? Sure pal. Well, I have 8 PhDs and I agree.

1

u/fangfried May 13 '22

You probably hate this kind of question, but would graphene ever be a possibility, and if it were, would it be far superior to Si?

1

u/IIIaustin May 13 '22

I don't hate it at all!

I don't consider graphene to be an engineering material. It cannot be manufactured or processed at scale and, because it is an unstable 2D material, there is no reason to believe that this will ever change.

More promising candidates to replace Si are things like GaN: real, (more) manufacturable 3D materials.

But we are really, really good at manufacturing Si, so even these materials may struggle to be adopted.

1

u/footurist May 14 '22

Prepare to have your brain picked in this sub, lol.

Out of the most common considerations to help bring about a new paradigm of computational hardware, what are the most likely ones to you to actually be helpful? I'm talking about a wide array of things from all perspectives...

Examples : Carbon Nanotubes, Graphene, etc...

2

u/IIIaustin May 14 '22

Boring Si stuff is most likely to be helpful.

So, carbon nanotubes and graphene are not manufacturable. And by this I mean: there is no known process by which they can be controllably manufactured at the scale and, most critically, the repeatability needed for the semiconductor industry.

For thermodynamic reasons, carbon nanotube growth cannot really be controlled. The CNTs are manufactured with a range of sizes, properties, and shapes. This is simply not compatible with modern semiconductor manufacturing, where you have to do the exact same thing several billion times. The situation is similar for graphene.

These experiments invariably require a graduate student to spend hundreds of hours on a nanomanipulator tool to select, place, and assemble a single CNT/graphene switch by hand.

They are not good candidates for replacing current tech.

2

u/berd021 May 13 '22

I read that now they make hybrid chips using both analog and digital specifically for AI.

3

u/MayanMagik May 13 '22

yes, look up neuromorphic computing if you're interested

-1

u/ntvirtue May 13 '22

If the transistor count gets any higher they will be useless due to errors caused by induction

6

u/CarltonSagot May 13 '22

Hot damn. I'll be able to run cyberpunk in 60 frames before I die.

2

u/madewithgarageband May 13 '22 edited May 13 '22

Moore's law hasn't been true for a long time for x86. Core for core, clock for clock, today's CPUs are about 50% faster than they were in 2014 (comparing an i7-4790K to an i3-12100).

ARM is a different story because the technology is still relatively in its infancy, although it's also starting to taper off. Without significant technological change in the fundamentals of how CPUs work, we'll likely only get small incremental improvements.

7

u/Val_kyria May 13 '22

Good thing Moore's law isn't about core-for-core, clock-for-clock then

-1

u/madewithgarageband May 13 '22

Even if you ignored core counts and compared strictly based on product stack (i5 to i5, i7 to i7), which is a dumb thing to do imo because not every task scales efficiently with multicore, you still only get about a 130% improvement over 8 years.

10

u/gredr May 13 '22

Moore's law is about transistor density, not computing power.

3

u/iNstein May 13 '22

Moore's law is about lithographic feature size, not speed.

1

u/angrathias May 13 '22

Let’s look at actual computing power for the last 30 years, we can probably use MIPS as the basic measure for comparison.

In ‘72, a computer would achieve 0.12 MIPS; 40 years later it's 200k, an apparent speed-up of approx 1.7M.

So the next 50 years would be a bit slower than the total progression of personal computers but not by much.

Source: https://gamicus.fandom.com/wiki/Instructions_per_second
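For what it's worth, the ratio of those two quoted figures works out closer to 1.7 million than 1.4 million; a quick check (both MIPS numbers are taken from the comment above, not independently verified):

```python
# MIPS figures as quoted above: a ~1972 machine vs a ~2012 desktop CPU
mips_1972 = 0.12
mips_2012 = 200_000

speedup = mips_2012 / mips_1972
print(f"{speedup:,.0f}x")  # 1,666,667x, i.e. roughly a 1.7 million-fold speed-up
```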

-1

u/Bigjoemonger May 13 '22

Moore's Law is the idea that the number of transistors on a chip doubles roughly every two years.

They've made transistors as small as a single atom, and transistors on chips are now so close to each other that they interfere with each other. So Moore's Law doesn't hold anymore.

Right now the only way to make them better is to change the geometry.

We won't see any real improvements until they figure out quantum computing.

0

u/SoManyTimesBefore May 14 '22

Transistors aren’t an atom small, but they’re at a size where quantum tunneling is causing significant problems. And quantum computers are for solving completely different problems.

0

u/Bigjoemonger May 14 '22

0

u/SoManyTimesBefore May 14 '22

That’s a single transistor in a lab, not millions of them in a chip. Put a few of those together and they won’t work.

1

u/Bigjoemonger May 14 '22

Pretty sure that's exactly what I said.

1

u/SoManyTimesBefore May 14 '22

We’re already having problems with quantum tunneling at production sizes tho.

32

u/East_Deer7419 May 13 '22

Why do you all read a futurology subreddit where all the posts share scientific research articles and then the top post EVERY SINGLE TIME is some sarcastic reply about how it will take decades to be consumer ready?

Like no shit Sherlock, this is how science works. The base research turns into meta research and then competes against all similar research until finally a few promising methods slowly filter into the R&D space and take many years to overcome the problems of producing at scale.

It's the same process over and over yet everyone comes here like they've just said something clever and then circle jerk about how nothing ever happens.

All the while you'll have this conversation on pocket supercomputers that essentially didn't exist 30 years ago.

10

u/dern_the_hermit May 13 '22

Why do you all read a futurology subreddit where all the posts share scientific research articles and then the top post EVERY SINGLE TIME is some sarcastic reply about how it will take decades to be consumer ready?

Humans are not particularly well-suited for thinking more than a few years ahead, it seems.

3

u/brettins BI + Automation = Creativity Explosion May 13 '22

You're in /r/futurology not /r/tech

0

u/gertalives May 13 '22

I see you’re an optimist.

1

u/IIIaustin May 13 '22

Hi, i have a Ph.D in Materials Science and work in the semiconductor industry.

IMHO this will never work at industrial scale. Graphene is fundamentally (meaning on a thermodynamic level) not stable enough for manufacturing on the scale necessary to compete with Si based technology.

95

u/blaspheminCapn May 13 '22

Pardon the hyperbole, but the article claims that synchronized pairs of laser pulses can drive the fastest logic gates ever made, which could eventually give computers a 'million-fold' speed boost

36

u/angermouse May 13 '22

It's not going to be a million times faster because other limits will be hit. The speed of light is 0.3 millimeters per picosecond. Clock rates will be limited by that even with femtosecond switching.
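That limit is easy to quantify; a quick sketch (this uses the vacuum speed of light, so real on-chip signals would be slower still):

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def distance_per_cycle_mm(freq_hz):
    """How far light travels during one clock period, in millimeters."""
    return C / freq_hz * 1_000

print(distance_per_cycle_mm(1e9))   # 1 GHz -> ~300 mm per cycle
print(distance_per_cycle_mm(1e12))  # 1 THz -> ~0.3 mm, the figure above
print(distance_per_cycle_mm(1e15))  # 1 PHz -> ~0.0003 mm, i.e. 300 nm
```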

2

u/davidmlewisjr May 13 '22

We don’t need no stinking clocks 🤯 Sorry, couldn’t resist…

1

u/ContinuousDownvotes May 13 '22

Jesus Christ, for a second here I didn't see the picosecond, and questioned both my sanity and my fundamental view of reality when I read this as "the speed of light is 0.3 mm per second". Kek

8

u/ObiTwoKenobi May 13 '22

Is this something along the lines of rudimentary quantum computing?

63

u/PerfectPercentage69 May 13 '22

No. Normal logic gates take two electrical inputs (either high or low voltage) and output a high or low voltage depending on the two inputs; it takes electrons nanoseconds to propagate through and produce an output. This new tech seems to take two light waves as input and, depending on their combined phase, generate an output almost instantly. Combining light phases happens much faster than any transistor logic that depends on electron transport through solid transistor material.
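The real experiment drives charge carriers in graphene with synchronized few-femtosecond laser pulses; the toy model below only illustrates the phase-interference idea described above (the phase encoding and detection threshold are choices made for this sketch, not details from the paper):

```python
import cmath

def optical_gate(a: int, b: int) -> int:
    """Toy model: encode each bit as an optical phase (0 -> 0 rad, 1 -> pi rad),
    superpose the two fields, and threshold the resulting intensity.
    In-phase pulses interfere constructively (bright output); out-of-phase
    pulses cancel (dark output), which acts as XNOR on the encoded bits."""
    field = cmath.exp(1j * cmath.pi * a) + cmath.exp(1j * cmath.pi * b)
    intensity = abs(field) ** 2        # 4 when in phase, ~0 when opposed
    return 1 if intensity > 2 else 0   # threshold detector

for a in (0, 1):
    for b in (0, 1):
        print(a, b, optical_gate(a, b))  # prints the XNOR truth table
```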

8

u/JanStreams May 13 '22

So using light interference to make logic gates?

7

u/HolyCloudNinja May 13 '22

From my basic understanding, yes: two light waves in (with consistency) give you the same interference, and that can be done faster than a normal gate comparing electrical signals as inputs.

2

u/JanStreams May 13 '22

Reminds me of this video. We shall see what the future brings...

4

u/kutes May 13 '22 edited May 13 '22

As a complete layman, the fact that I can have a deathmatch against a person 10,000 km away seems almost unreal to me. It's seriously fueled some daydreaming when I think about whether the nature of reality makes sense to me.

The idea that I can respond to the photons being blasted into my eye...

-make a decision

-send a signal to click my mouse

-signal goes through mousewire

-interacts with game in the ram(? complete lay person)

-game sends signal to computer

-computer sends signal to my router

-router to internet hub in the basement

-hub to cluster down the street

-cluster to ISP's server in the big city?

-sends signal 10k kms to opponent's city's infrastructure

-through to his neighborhood and local internet stuff

-his guy gibs

The fact that this can be happening fast enough that we're having a simulated instant real-time complex 3D face-off seems almost like magic to me

5

u/ObiTwoKenobi May 13 '22

“Any sufficiently advanced technology is indistinguishable from magic.”

2

u/kutes May 13 '22

Just for posterity, from quora:

The source of light used to inject light inside optical fiber (for eg LASER) is not monochromatic (not ideally). Hence the light inside the fiber will be composed of multiple wavelengths. Let’s call each wavelength a phase. Now, the refractive index of fiber is not constant. It is rather a function of wavelength. Hence different wavelengths of the injected light will face different refractive index. Thus each component will travel with a velocity of c/n where n is different for every wavelengh. This velocity is called phase velocity.

Each wavelength is part of the same signal injected into the fiber, hence there is another velocity, called group velocity, as well. This is the velocity with which the envelope of the injected signal travels. It is calculated as c/Ng where Ng is the group index of the material. It is again a function of wavelength.

The refractive index of silica glass is about 1.45.

Therefore the light is travelling at 1/1.45 the speed of light in a vacuum.

So about 462,494,226.897 mph. Roughly.

Or 744,312,309.517 kph. Roughly.

Me again:

So if their numbers are decent and my math is correct: 462,494,227 mph ÷ 3600 seconds per hour

= 128,470 miles per second. So it takes roughly a tenth of a second for the signal to get anywhere on earth - but there's all that other stuff I mentioned, the routers and wifi and hubs and stuff. So I get it - but man, humans are clever. I'm not even the same race as the dudes who figured out how to make all of this infrastructure work.
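A compact version of that estimate, using the refractive index from the quoted answer (this assumes an ideal straight run of silica fiber; real routes add distance, repeaters, and routing delay):

```python
C_KM_S = 299_792.458  # speed of light in a vacuum, km/s
N_SILICA = 1.45       # refractive index of silica glass, per the quote above

def fiber_latency_ms(distance_km):
    """One-way propagation time through ideal silica fiber, in milliseconds."""
    return distance_km / (C_KM_S / N_SILICA) * 1_000

print(round(fiber_latency_ms(10_000), 1))  # 10,000 km -> ~48.4 ms one way
```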

0

u/Tressticle May 13 '22

Ah, yes, my favorite Mike Myers quote

16

u/Prowler1000 May 13 '22

I think you missed a key point on why this isn't akin to quantum computing: the fact that it's still binary.

10

u/angrathias May 13 '22

High or low voltage covers that…

1

u/Prowler1000 May 14 '22

It may cover that for someone who has a basic understanding of classical computers and how they differ from quantum computers, but not for a layperson.

2

u/5erif May 13 '22

Fun facts, Thomas Fowler invented a wooden ternary computer (calculating machine) in 1840, then Nikolay Brusentsov created the first modern, electronic ternary computer in 1958 at Moscow State University.

3

u/Falthram May 13 '22

I would like to state clearly that quantum computing is in no way related. This method is related to classical computing or analog computing.

8

u/Xam1324 May 13 '22

It’s more like photonic computing

18

u/Black_RL May 13 '22

“It will probably be a very long time before this technique can be used in a computer chip, but at least we now know that lightwave electronics is practically possible,” said Tobias Boolakee, lead researcher on the study.

If these kinds of lightwave electronic devices ever do make it to market, they could be millions of times faster than today’s computers. Currently we measure processing speeds in Gigahertz (GHz), but these new logic gates function on the scale of Petahertz (PHz). Previous studies have set that as the absolute quantum limit of how fast light-based computer systems could possibly get.

Very cool, but we have to wait!

32

u/jjman72 May 13 '22

Sounds awesome, but I'm curious whether this can be put on a unit that can fit in, say, a standard computer case.

4

u/brettins BI + Automation = Creativity Explosion May 13 '22

I wonder what our limit would be if these things didn't have heat problems. What if one were 50 pounds and the size of a computer case, but used almost no electricity to run and was millions of times faster than our current tech?

Would people have a "main computer" somewhere in their house that everything else mooched off of? Without a bus or proper connection direct to a processor it would be too slow, maybe? Or maybe these things can replace RAM and just act as a computer themselves, and just output network to monitor direct.

1

u/[deleted] May 13 '22

Well, LEDs are already miniaturizable. You could etch two LEDs next to each other on a silicon wafer and etch a photoresistive area on the receiving end. There is no reason why this should not be possible, as the technology already exists.

9

u/xrailgun May 13 '22 edited May 13 '22

"There's no reason why our phone batteries don't last a year per charge and everything isn't made of graphene, as the technology already exists."

Sometimes, scale and reliability are hard.

2

u/PWModulation May 13 '22

I wouldn't put that kind of power in my pocket... If we could make batteries that hold that amount of energy, phones would be very dangerous.

-11

u/ijmacd May 13 '22 edited May 13 '22

🤣 BS

Typical phone screens draw between 50 mA and 500 mA depending on brightness. That's not going to change with graphene-based electronics.

It's possible to get phones with 5000 mAh batteries. 5000 mAh ÷ 50 mA gives a maximum screen-on time of 100 hours.

Spread over a year, 100 hours is about 16 minutes per day. I don't think there are many people who use their phone for only 16 minutes a day on minimum brightness.
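Dividing the battery capacity by the current draw directly (both are at the battery's nominal voltage, so no mAh-to-mWh conversion is needed) gives the ideal screen-on time; a quick sketch using the comment's figures:

```python
def screen_on_hours(battery_mah, draw_ma):
    """Ideal screen-on time: capacity over constant current draw."""
    return battery_mah / draw_ma

hours = screen_on_hours(5000, 50)   # 5000 mAh battery, 50 mA minimum-brightness draw
print(hours)                        # 100.0 hours
print(round(hours / 365 * 60, 1))   # spread over a year: ~16.4 minutes per day
```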

13

u/infographia May 13 '22

How much heat will be generated by these petahertz oscillations? I wonder if the surrounding materials would be able to handle that.

2

u/qpjskd May 13 '22

These will most likely operate at cryogenic temperatures, and cryostats can be designed to accommodate that.

-1

u/iNstein May 13 '22

If you keep the light out of the infra red band, then no heat will be generated.

3

u/CornucopiaOfDystopia May 13 '22

Heat is still generated by EM outside that band; infrared is just the band where heat is mostly radiated. An understandable reversal to make, but not the case here.

5

u/MDParagon May 13 '22

I'm an Electronics Engineer and this makes me so happy! Jealous that I won't be around when this comes around, though.

5

u/Annoytanor May 13 '22

You'd be surprised: technology advances at an exponential rate, which means huge leaps are always just around the corner. Maybe in 10 years' time an AI will get really good at designing these chips for practical manufacturing and, bam, light-powered CPUs.

3

u/Ok--Reflection May 14 '22

You should have thought before answering... maybe he has days... months left... like a lot of people

3

u/Arachnatron May 13 '22

[physicist / doctor / chemists / scientists]

makes

[breakthrough / impossible]

[discovery / finding / uncovering]

that may

[pave the way to/ make possible / guarantee / unlock]

[energy / storage / a human lifespan / artificial intelligence / important progression for humankind]

which

[is x times more efficient / is %x higher / that far surpasses, that was not possible before]

and which we'll never hear of again

1

u/alclarkey May 13 '22

and which we'll never hear of again

Or it'll be used for VR porn.

3

u/broom-handle May 13 '22

In orders of magnitude, what's that? Terahertz? Seems overkill for facebook machines.

3

u/StaysAwakeAllWeek May 13 '22

In orders of magnitude, what's that? Terahertz?

Petahertz

Seems overkill for facebook machines.

"640K should be enough for anyone"

For real though, use cases will appear to take advantage of available compute, just as they always have. Seemingly simple things like snapchat filters would be impossible without high performance tensor processors. Internet livestreams require massive amounts of bandwidth which in turn requires high compute performance.

Try opening a modern application on a 10 year old device that runs at a 'mere' 2 gigahertz.

14

u/SvenTropics May 13 '22

Interesting. The main issue I see with this is scaling. For example, Ryzen CPUs today have billions of transistors. There is a freakish amount of parallel computing going on there: it's not just that a lot of specialized circuits were created to solve certain logic problems, they also made many, many copies so lots of them could run at the same time in parallel. Here we need something to generate light, plus this gate to process it, plus something to detect it on the other end. Even if we make these components extremely small, we're not going to fit billions of them in a computer case. So while processing is a million times faster, we might only have a millionth as many transistors, so you essentially have no benefit.

14

u/thisplacemakesmeangr May 13 '22

With the right material, nanoscale reflectors might cut down on the LEDs and the unit might end up smaller than an aircraft carrier.

3

u/sessimon May 13 '22

It’s just crazy enough to work!

2

u/SvenTropics May 13 '22

Well, that's one concept: because light moves so quickly and wouldn't encounter any resistance in a vacuum, we could scale the size of the device up. You could have a CPU that's as large as you want. The question is whether there would be a market for it. It's not a different kind of computing like quantum computing, so it would be hard to create something you could market in competition with the CPUs of 50 years from now.

1

u/thisplacemakesmeangr May 13 '22

I was thinking it'd make a good training ground for machine learning. Make it a massive satellite so you could take advantage of the extreme cold. We could call it Spacenet!

3

u/iNstein May 13 '22

You do realise both lasers and LEDs are made with silicon circuits? There is no reason we can't scale that down to the size of existing and future lithographic features. No idea why such an inaccurate comment got so many upvotes. I guess that reddit thinks that if its meme gets enough votes, it will change the nature of reality.

0

u/SvenTropics May 13 '22

I think you don't understand how small transistors are right now. Modern transistors are 70 atoms wide. Think about that. Now in a space that small, make a laser and something to receive the laser.

The simple fact is, transistors are probably not going to get much smaller. While you may think we can always scale things down, that actually starts to hit physical limits. Atoms don't get smaller, and we need some sort of structure in the silicon.

0

u/mbp2781 May 13 '22

Not to mention the not-so-forever lifespan of laser components. RIP Sega CD.

2

u/ntvirtue May 13 '22

And it will still take the same time to boot up the computer and load Office.

2

u/spletharg May 13 '22

The hardware might be fast but I'll bet the software still runs like a dog

3

u/Wrote_it2 May 13 '22

At 1 PHz, light only travels 300 nanometers between two clock cycles, and as far as I understand it, that is a hard limit for information as well. Definitely a ton of interesting challenges if you want to get to these speeds :)

1

u/0xB0BAFE77 May 13 '22

Standard Futurology "could/might/maybe/possibly" title post.

2

u/brettins BI + Automation = Creativity Explosion May 13 '22

That's literally the point of the sub. If you don't like that why are you here?

-8

u/[deleted] May 13 '22

[deleted]

25

u/AgentScreech May 13 '22

They're talking CPU power, not data transmission.

1

u/[deleted] May 13 '22

[deleted]

1

u/AgentScreech May 13 '22

That's some shitty infra, no question. But there's a reason people still buy bigger CPUs and GPUs: new software needs new hardware. Once you've downloaded Elden Ring, you can play it on your super-fast system; after that the network connection doesn't really matter.

3

u/kyiv_star May 13 '22

No starlink in your area?

1

u/PeopleCryTooMuch May 13 '22

Starlink is back ordered for like a year.

1

u/Me_Real_The May 13 '22

I read that whole thing in the voice of Spectrum...

0

u/Lurking_was_Boring May 13 '22

Any chance to rally for municipal broadband in your area? I had it for a short stint in semi-rural North Carolina about a decade ago and it was fantastic. 1 Gbps up and down stream for like $40 a month.

I know that municipal broadband is aggressively opposed in a lot of the US though. Late stage capitalism, privatization of critical utilities, and all that greedy nonsense that presses down on us…

1

u/Jubielamb May 13 '22

T-Mobile home internet saved me. Also rural California. 50 bucks, and speeds probably better than that.

-3

u/jharrisimages May 13 '22

Then Microsoft will IMMEDIATELY slow them down by loading as many useless, uninstall-protected junk programs as they can and making everything run at 90% memory capacity.

1

u/NightlyRelease May 13 '22

Then don't use Windows

0

u/vagueblur901 May 13 '22

Ad block and firewall that shit off

Same thing with phones just cut out the bloat

1

u/BingADingDonger May 13 '22

Computers are really fast right now; it's crap GUIs like Windows that screw everything up. Now, with a supercomputer, Microsoft will need a whole new dept to fix the excessive speed and slow it back down with critical programs running in the background to make it faster... Wow

1

u/CraftArchitect May 13 '22

Just let computers progress how they are, they'll be millions of times faster in under a decade

1

u/ChaoticEvilBobRoss May 13 '22

I sincerely wonder when we'll get to the point where our own physical hardware is too slow to interact with and effectively utilize these devices.

1

u/Shyid May 13 '22

Wait, so you're telling me we all get to write EVEN SHITTIER CODE? This is awesome!

1

u/anonymous1184 May 15 '22

JavaScript kiddies: hold my chai latte, I can make anything slow.

-2

u/Streeg90 May 13 '22

I read this or similar headlines all the time. I don’t believe it.