• Buffalox@lemmy.world · 10 points · 1 day ago

    AFAIK the smallest usable atom is carbon at about 150 picometers, and the smallest number of atoms theoretically needed to make a transistor is 3, so there is (probably) no way to go below 450 picometers. There is probably also no practical way to actually reach 450 picometers, which is the same as 0.45 nanometers.
    So the idea that they are currently building features below 2 nm is of course untrue, but IDK what the real measure is?

    What they are doing at the leading chip manufacturing factories is amazing, so amazing it’s kind of insane. But it’s not actually 2nm.

    Just for info, one silicon (silicium) atom is about 0.2 nm across.
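
    Just to spell out the arithmetic (a rough back-of-envelope sketch in Python; the atom sizes are the approximate figures quoted above, not precise values):

    ```python
    # Back-of-envelope only: atomic "size" is not a sharply defined quantity.
    carbon_diameter_pm = 150       # rough carbon atom diameter quoted above
    silicon_diameter_pm = 200      # rough silicon atom diameter (0.2 nm)
    min_atoms_per_transistor = 3   # assumed theoretical minimum from the comment

    floor_pm = carbon_diameter_pm * min_atoms_per_transistor
    print(f"{floor_pm} pm = {floor_pm / 1000} nm")            # 450 pm = 0.45 nm
    print(f"{3 * silicon_diameter_pm} pm = "
          f"{3 * silicon_diameter_pm / 1000} nm (silicon)")   # 600 pm = 0.6 nm
    ```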

    • chonomaiwokurae@sopuli.xyz · 3 points · 1 day ago

      The whole idea of representing different nodes and their development with a single number is a bit silly. That said, it looks like future channel materials could be about 0.7 nm thick (monolayer WX2).

    • jj4211@lemmy.world · 5 points · 1 day ago

      For a while now the “nm” has been a marketing label: roughly what the feature size would be if you extrapolated the way things used to scale onto today’s processes. The industry spent so long measuring things that way that when the measurement broke down they just kind of had to fudge it to keep the basis of comparison going, for lack of a better idea. If we had some fully volumetric approach that built these things up equally in three dimensions, we’d probably be quoting a sub-“100 pm” process easily, absurd as that sounds.
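
      To illustrate what that extrapolation looks like (a toy sketch; the reference node and density figures are made up for illustration, not real foundry data):

      ```python
      import math

      # Toy model of "marketing nm": pick a reference node where the label still
      # roughly matched a real feature size, then assume the old planar rule that
      # transistor density scales as 1 / node^2. Numbers are illustrative only.
      ref_node_nm = 90.0   # hypothetical reference node
      ref_density = 1.0    # density at that node, arbitrary units

      def marketing_node_nm(density):
          """Node label you'd quote if density ~ 1/node^2 still held."""
          return ref_node_nm * math.sqrt(ref_density / density)

      for gain in (10, 100, 1000, 10000):
          print(f"{gain:>5}x density -> '{marketing_node_nm(gain):.2f} nm'")
      # 10000x the reference density would be sold as a "0.90 nm" node under this
      # rule, whether or not any feature on the chip is actually that small.
      ```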

        • jj4211@lemmy.world · 2 points · 1 day ago

          As I said, it’s an extrapolation of the rules from once upon a time onto a totally different approach. It’s marketing and increasingly subjective, so any number can “make sense” in that context. The number hasn’t been based on anything you could actually measure for a long time now; it’s already a fiction, so it can go wherever.

        • AA5B@lemmy.world · 1 point · 1 day ago

          People have accepted heat pumps as 400% efficient. This is the same.

          And realistically, how do you describe in an approachable way “you experience what would look like an impossible number if we had continued as before”? The “if” is key, as is the “you experience”.

          • Bradley Nelson@lemmy.world · 2 points · 1 day ago

            For what it’s worth, I think the heat pump measurement makes way more sense. What I want is to heat my house: I give you one watt-hour and you give me four watt-hours of heat. Sounds like 400% to me.

            The real issue here is that, for the most part, the measurements never meant anything for silicon chips. At least not to end users.
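
            Spelled out, that arithmetic is just the coefficient of performance:

            ```python
            # Heat pump "efficiency" is a coefficient of performance (COP): heat
            # delivered per unit of electricity bought. The extra heat is moved
            # indoors from outside, not created from nothing, so COP > 1 is fine.
            electricity_in_wh = 1.0
            heat_delivered_wh = 4.0

            cop = heat_delivered_wh / electricity_in_wh
            print(f"COP = {cop:.1f} -> {cop:.0%} 'efficiency'")  # COP = 4.0 -> 400% 'efficiency'
            ```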

    • Hadriscus@jlai.lu · 2 points · 1 day ago

      That’s super cool. Asking as a total layman: what’s preventing the use of subatomic particles as transistors?

      • Buffalox@lemmy.world · 3 points · 1 day ago

        In the future light may be a possibility: light is made of photons, and photons can basically follow the same path in both directions simultaneously without colliding.
        So without in any way being an expert, I would think that if light can somehow be controlled precisely enough, it could go way below the scale of any atom, even if the paths still need to be defined by atoms.
        But AFAIK there is no practical working model for that yet, although research on it has been going on for decades.

          • Cocodapuf@lemmy.world · 1 point · 1 day ago

            No no, quantum computing is more about using the quantum properties of particles to do computing in ways that you simply can’t with traditional computers. If you write your program to accommodate this kind of computing, you can essentially design programs to test all possible outputs simultaneously - a pretty neat trick.

            Right now we’re talking about photonic computing: simply using photons as the circuitry within a processor rather than electronic circuits using electrons.

            Though I’m not an expert on either, so I’m probably the wrong person to ask for more information on the subject.

      • theparadox@lemmy.world · 2 points · 1 day ago

        Not an expert but… typical computers do what they do by transmitting (primarily) electrical signals between components. Either there is a signal or there isn’t: that’s the “bit” with two states, on or off, 1 or 0. Electricity is the flow of electrons between atoms. Basically, we take atoms that aren’t very attached to some of their electrons and manipulate them so they pass those electrons along when we want them to. I don’t know if there is a way to conduct and process electrical signals without using an atom’s relationship with its electrons.

        Quantum computing is the suspected new way to get to “better” computing. I don’t know much about the technical side, beyond that it uses quantum physics to expand the bit into something like a qubit, which exploits superposition (quantum particles existing in multiple states simultaneously until measured, like the Schrödinger’s cat metaphor) and entanglement (if two quantum particles’ states are related to or dependent on each other, determining the state of one also determines the state of the other) to transmit and process more than just a simple 1 or 0 per qubit. A lot more information can be carried and processed simultaneously with a more complex bit. As I understand it, quantum computing has been very slow going.

        That’s my shitty explanation. I’m sure someone will come along to correct my inaccurate simplification and list everything I missed, like fiber-optic transmission of signals.
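
        If it helps, here’s a minimal numbers-only sketch of the superposition/entanglement part (just state vectors in NumPy, no actual quantum hardware):

        ```python
        import numpy as np

        # A qubit's state is a vector of amplitudes; squared magnitudes give the
        # probabilities of measuring 0 or 1.
        plus = np.array([1, 1]) / np.sqrt(2)        # equal superposition of 0 and 1
        print(np.abs(plus) ** 2)                    # [0.5 0.5]

        # Two entangled qubits (a Bell state): amplitudes over 00, 01, 10, 11.
        bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
        print(np.abs(bell) ** 2)                    # [0.5 0.  0.  0.5]
        # Only 00 and 11 ever show up, so measuring one qubit fixes the other, and
        # this 4-entry vector cannot be split into two separate 2-entry qubit
        # vectors, which is what "entangled" means here.
        ```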

        • bunchberry@lemmy.world · 1 point · 17 hours ago

          The reason quantum computers are theoretically faster for some problems is the non-separable nature of quantum systems.

          Imagine you have a classical computer where some logic gates flip bits randomly, and multi-bit logic gates can flip them randomly but in a correlated way. These kinds of computers exist and are called probabilistic computers; you can represent the statistics of all the bits with a single vector, and the logic gates with matrices called stochastic matrices.

          This vector is in general non-separable, meaning you cannot get the right predictions if you describe the statistics of the computer with a separate vector assigned to each p-bit; you must assign a single vector to all p-bits taken together. This is because the statistics can become correlated, i.e. the statistics of one p-bit can depend on another, and if you describe them with separate vectors you lose the information about those correlations.
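
          A tiny NumPy sketch of that point (illustrative numbers of my own, not taken from the linked paper):

          ```python
          import numpy as np

          # Two perfectly correlated p-bits, described by one joint probability
          # vector over the outcomes (00, 01, 10, 11):
          joint = np.array([0.5, 0.0, 0.0, 0.5])      # the bits always agree

          # A gate that flips BOTH bits together with probability p, written as a
          # stochastic matrix acting on the joint vector (columns sum to 1):
          p = 0.3
          flip_both = np.array([
              [1 - p, 0,     0,     p    ],
              [0,     1 - p, p,     0    ],
              [0,     p,     1 - p, 0    ],
              [p,     0,     0,     1 - p],
          ])
          print(flip_both @ joint)                     # [0.5 0.  0.  0.5]

          # Each bit on its own looks like a fair coin, so two separate per-bit
          # vectors would lose the fact that the bits always agree:
          bit0 = np.array([joint[0] + joint[1], joint[2] + joint[3]])   # [0.5 0.5]
          bit1 = np.array([joint[0] + joint[2], joint[1] + joint[3]])   # [0.5 0.5]
          print(np.outer(bit0, bit1).ravel())          # [0.25 0.25 0.25 0.25] != joint
          ```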

          The p-bit vector grows in complexity exponentially as you add more p-bits to the system (complexity = 2^N, where N is the number of p-bits), even though the description of the p-bits themselves only grows linearly (complexity = 2N). The reason for this is purely epistemic: the physical system only grows in complexity linearly, but because we are ignorant of its actual state (2N), we have to consider all possible configurations of the system (2^N) over an infinite number of experiments.

          The exponential complexity arises from considering what physicists call an “ensemble” of individual systems. We are not considering the state of the physical system as it exists right now (which only has a complexity of 2N), precisely because we do not know the values of the p-bits. Instead, we are considering a statistical distribution that represents repeating the same experiment an infinite number of times and tallying the results; in such an ensemble the system takes every possible path, so the ensemble has far more complexity (2^N).
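
          In numbers, that 2N vs. 2^N gap looks like this:

          ```python
          # Linear growth of the physical description vs. exponential growth of the
          # ensemble/probability vector as p-bits are added:
          for n in (1, 2, 4, 10, 20, 50):
              print(f"N={n:>2}: physical ~ {2 * n:>3} values, "
                    f"ensemble vector ~ 2^{n} = {2 ** n} entries")
          ```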

          This is a classical computer with p-bits. What about a quantum computer with q-bits? It turns out that you can represent all of quantum mechanics simply by allowing probability theory to have negative numbers. If you introduce negative numbers, you get what are called quasi-probabilities, and this is enough to reproduce the logic of quantum mechanics.

          You can imagine that quantum computers consist of q-bits that can be either 0 or 1, and logic gates that randomly flip their states. But rather than representing the q-bit in terms of the probability of being 0 or 1, you represent it with four numbers: the first two associated with its probability of being 0 (summing them gives the real probability of 0) and the second two associated with its probability of being 1 (summing them gives the real probability of 1).

          Like in normal probability theory, the numbers all have to add up to 1, i.e. 100%, but because you have two numbers assigned to each state, some quasi-probabilities can be negative while the whole thing still adds up to 100%. (Note: we use two numbers instead of one to describe each state with quasi-probabilities because otherwise the introduction of negative numbers would break L1 normalization, which is a crucial feature of probability theory.)
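
          As a made-up numerical example of that bookkeeping (not the actual formalism in the linked paper, just the two-numbers-per-outcome idea described above):

          ```python
          import numpy as np

          # Two quasi-probability entries per outcome; each pair sums to the real
          # probability of that outcome, the whole thing sums to 1, and individual
          # entries are allowed to be negative. Values are made up for illustration.
          quasi = np.array([0.75, -0.25,    # pair associated with outcome 0
                            0.25,  0.25])   # pair associated with outcome 1

          p0 = quasi[0] + quasi[1]          # real probability of measuring 0
          p1 = quasi[2] + quasi[3]          # real probability of measuring 1
          print(p0, p1, quasi.sum())        # 0.5 0.5 1.0
          ```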

          Indeed, with that simple modification, the rest of the theory just becomes normal probability theory, and you can do everything you would normally do in normal classical probability theory, such as build probability trees and whatever to predict the behavior of the system.

          However, this is where it gets interesting.

          As we said before, the exponential complexity of classical probability is assumed to be merely something epistemic, because we are considering an ensemble of systems even though the physical system in reality only has linear complexity. Yet it is possible to prove that the exponential complexity of a quasi-probabilistic system cannot be treated as epistemic: there is no classical system with linear complexity whose ensemble will give you quasi-probabilistic behavior.

          As you add more q-bits to a quantum computer, its complexity grows exponentially in a way that is irreducible to linear complexity. For a classical computer to keep up, every additional q-bit forces the number of values you have to track in the simulation to grow exponentially. At just 300 q-bits the complexity is already 2^N = 2^300, which means the number of values you would need to simulate it exceeds the number of atoms in the observable universe.
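
          Checking that 300-q-bit figure:

          ```python
          # 2^300 entries vs. the ~10^80 atoms usually quoted for the observable
          # universe (that atom count is itself only a rough order-of-magnitude figure).
          entries = 2 ** 300
          print(len(str(entries)) - 1)        # 90, i.e. 2^300 is on the order of 10^90
          print(entries > 10 ** 80)           # True
          ```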

          This is what I mean by quantum systems being inherently “non-separable.” You cannot take an exponentially complex quantum system and imagine it as separable into an ensemble of many individual linearly complex systems. Even if it turns out that quantum mechanics is not fundamental and there are deeper deterministic dynamics, the deeper deterministic dynamics must still have exponential complexity for the physical state of the system.

          In practice, this increase in complexity does not mean you can always solve problems faster. The system might be more complex, but it requires clever algorithms to figure out how to actually translate that into problem solving, and currently there are only a handful of known algorithms you can significantly speed up with quantum computers.

          For reference: https://arxiv.org/abs/0711.4770

        • nightlily@leminal.space · 2 points · 1 day ago

          Quantum computing can’t achieve better outcomes for general computing problems than classical computing can. It just makes certain kinds of algorithms possible (like Shor’s algorithm for factorising large numbers into primes) that classical computing can’t run efficiently. It’s still a lot of smoke and mirrors at the moment though.

          • Cocodapuf@lemmy.world · 1 point · 1 day ago

            Ok, that paper is pretty fabulous. That does make for a good sanity check for quantum computing feasibility.

            That said, don’t be surprised when these things catch up quickly!

        • Hadriscus@jlai.lu · 1 point · 1 day ago

          Cheers! So supposedly there has to be some tangible matter (atoms) to form the transistors?

          Quantum computing is exciting. I remember a magazine my parents had in the early 90s that had a cover like “quantum computers! soon!” lol