LHCb discovers matter-antimatter asymmetry in charm quarks

(symmetrymagazine.org)

270 points | by rbanffy 1863 days ago

10 comments

  • greesil 1863 days ago
    "The idea that matter and antimatter particles behave slightly differently is not new and has been observed previously in studies of particles containing strange quarks and bottom quarks. What makes this study unique is that it is the first time this asymmetry has been observed in particles containing charm quarks."
    • pytyper2 1863 days ago
      What is the difference and what was this information blocking us from discovering?
      • greesil 1863 days ago
        I'm not a particle physicist, I just copy and paste paragraphs.
      • crdrost 1863 days ago
        So we have these matter particles that are defined by three numbers, a weak hypercharge Y in the set {1, 1/3, -1/3, -1}, a weak isospin T in the set {1/2, -1/2}, and a ‘generation’ in the set {0, 1, 2}. The electric charge is a derived quantity Q = T + Y/2 that does not depend on generation. For generation 0 we have the particles,

              T     Y     Q    name
            -----|-----|-----|----------
            -1/2   -1    -1    electron
            -1/2  -1/3  -2/3   antiup quark
            -1/2   1/3  -1/3   down quark
            -1/2    1     0    antineutrino
             1/2   -1     0    neutrino
             1/2  -1/3   1/3   antidown quark
             1/2   1/3   2/3   up quark
             1/2    1     1    positron
        
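        The Q = T + Y/2 relation can be checked mechanically against the table. A minimal sketch in Python, with the values copied from the table above (exact fractions avoid floating-point fuss):

```python
# Generation-0 particles from the table: (T, Y, expected Q, name).
from fractions import Fraction as F

PARTICLES = [
    (F(-1, 2), F(-1),    F(-1),    "electron"),
    (F(-1, 2), F(-1, 3), F(-2, 3), "antiup quark"),
    (F(-1, 2), F(1, 3),  F(-1, 3), "down quark"),
    (F(-1, 2), F(1),     F(0),     "antineutrino"),
    (F(1, 2),  F(-1),    F(0),     "neutrino"),
    (F(1, 2),  F(-1, 3), F(1, 3),  "antidown quark"),
    (F(1, 2),  F(1, 3),  F(2, 3),  "up quark"),
    (F(1, 2),  F(1),     F(1),     "positron"),
]

def electric_charge(T, Y):
    """Electric charge as the derived quantity Q = T + Y/2."""
    return T + Y / 2

# Every row of the table satisfies the relation.
for T, Y, Q, name in PARTICLES:
    assert electric_charge(T, Y) == Q, name
```
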
        The weak interaction has to preserve these quantum numbers, so for example when a free neutron (up-down-down) turns into a proton (up-up-down) and an electron, as it will do if you leave it alone for about 15 minutes, then one of the down quarks is turning into an up quark and an electron. This preserves electric charge but it does not preserve the underlying quantum numbers, so it requires emitting an antineutrino. [The fact that you need 4 particles total is part of why it takes a long time on the order of minutes; in this case the Feynman diagram vertexes only have three lines going in/out and so creating a 4-particle state requires two of them, in the middle you have a W- boson, (T=-1, Y=0).]
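        The bookkeeping in that beta-decay example can be spelled out: at each three-line vertex, the totals of T and Y going in must equal the totals going out. A small sketch, assuming the (T, Y) assignments from the table above and (T=-1, Y=0) for the W- boson:

```python
# (T, Y) quantum numbers, taken from the discussion above.
DOWN     = (-1/2,  1/3)
UP       = ( 1/2,  1/3)
ELECTRON = (-1/2, -1)
ANTINU   = (-1/2,  1)
W_MINUS  = (-1,    0)

def conserved(incoming, outgoing):
    """Check that total T and total Y match between the two sides of a vertex."""
    totals = lambda particles: tuple(sum(c) for c in zip(*particles))
    return totals(incoming) == totals(outgoing)

# First vertex: a down quark becomes an up quark plus a W-.
assert conserved([DOWN], [UP, W_MINUS])
# Second vertex: the W- becomes an electron plus an antineutrino.
assert conserved([W_MINUS], [ELECTRON, ANTINU])
```
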

        And then there are some things which are not present but fit the quantum numbers, for example many grand unified theories predict something which is spectacularly unobserved called “proton decay” where an up (1/2, 1/3) could hypothetically annihilate with a down (-1/2, 1/3) to generate an antiup (-1/2, -1/3) plus a positron (1/2, 1)—this would manifest as a proton decaying into a neutral pion (up-antiup) plus a positron, which would presumably be hugely energetically favored (protons have several times the mass of pions and electron/positron masses are negligible)... this sort of decay does not have a way to happen in the standard model because there is no interim (0, 2/3) particle to sit between the two vertices.

        Anyway, the next generations up are basically copies of the same 8 matter particles, with "electron" replaced by "muon" and then "tau", "neutrino" replaced by "mu neutrino" and then "tau neutrino," "down" replaced by "strange" and then "bottom", and "up" replaced by "charm" and then "top". The down and up quarks typically cannot decay into anything without some antidown or antiup quarks sitting around to annihilate with them, though again, this is not 100% obvious from the table above, as the case of proton decay shows. So that we have observed this with strange and bottom quarks is two out of our four possibilities.

        So what this makes very clear is that the CP-violations are not something specific to the (-1/2, 1/3) / (1/2, -1/3) antiparticle pairs that are called (anti-)down, (anti-)strange, (anti-)bottom in the three generations. It is not some sort of physics phenomenon that requires these two signs to be opposite; it has now been observed in the (-1/2, -1/3) / (1/2, 1/3) antiparticle pairs, too. Assuming that the presence in the bottom quark means that this asymmetry crosses generation lines, then we are all but assured that the much harder to measure top quarks would also display the asymmetry, and it is something very fundamental, rather than some as-yet-unappreciated aspect of the coupling of isospin to hypercharge.

        • sprayk 1863 days ago
          Thank you so much for this. Not only did this description help me understand what this discovery actually meant, it clarified and tied together a lot of what I have been trying to learn through occasional reading and youtube videos on the subject. The table of particles and associated values and the description of the rules of decay bridged a huge gap I've had in understanding all of this for a really long time.

          When you say "The fact that you need 4 particles total is part of why it takes a long time on the order of minutes...", does it take a long time because there are significantly fewer decays (described by Feynman diagrams?) from a lone neutron that result in a proton, anti-neutrino, and electron than there are decays that end up back at a neutron?

          • crdrost 1862 days ago
            So like it wouldn't be a decay if it went neutron → neutron, if that makes sense. There is one “main” diagram which goes neutron → neutron and it looks like a straight line with no vertices and it is by far the most probable thing, most neutrons just stay neutrons.

            So there are two reasons that a free neutron outside of a nucleus takes so long to become a proton, and you can picture it like pulling a molecule of air through an air filter: the first reason this particular setup takes so long is that this particular filter is really thick, and the second reason is that the fan you're using is not very strong.

            The “filter being thick” has to do with this intermediate particle, and that’s what I was alluding to above. The filter is thick because you need to create this W- boson. The problem is that this boson has about twice the mass of the neutron itself, call it Bohb because it’s a Big Ol’ Honking Boson. There's just nowhere near the energy in the system to create this thing directly. And in quantum mechanics that is okay because quantum systems can “tunnel” through states that they cannot actually occupy: but it generally takes longer and longer the more and more energy you need to borrow, and this is a lot of energy to borrow.

            The other thing is the weak blower, and that has to do with what “pressure” or “energy difference” drives the decay. In this case the driver is the mass difference: down-quarks are just intrinsically about 2 MeV heavier than up-quarks and that is enough to cover the 0.5 MeV of an electron and a neutrino, so you have something like 1.5 MeV left over to spread across the universe. By itself that number doesn't mean anything, though; what means something is the ratio of the initial to the final masses, which is something like 939.57 MeV : 938.78 MeV, so the final mass is only about 0.08% lighter than the initial mass. The reaction rate goes like some high power (a fifth or sixth power) of this ratio, so when one side has like half the mass of the other side the reaction happens very, very fast because there is so much pressure driving it. But in this case the masses are so close to equal that the reaction takes something like hundreds of times longer than you might otherwise expect from just the thickness of the barrier alone.
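
            The "fifth or sixth power" scaling is essentially Sargent's rule for beta decay (rate roughly proportional to the fifth power of the released energy). A rough back-of-envelope sketch of why the tiny energy release matters so much, using approximate particle masses in MeV; the 10 MeV comparison decay is purely hypothetical:

```python
# Rough illustration of Sargent's rule: beta-decay rate scales roughly as Q^5,
# where Q is the energy released. Masses in MeV (approximate values).
M_NEUTRON  = 939.57
M_PROTON   = 938.27
M_ELECTRON = 0.511

# Energy released in free-neutron decay: a bit under 1 MeV.
q_neutron = M_NEUTRON - M_PROTON - M_ELECTRON

# Hypothetical comparison: a decay releasing 10 MeV instead.
q_fast = 10.0

# Under the ~Q^5 scaling, the 10 MeV decay would be hundreds of thousands
# of times faster, all else being equal.
ratio = (q_fast / q_neutron) ** 5
print(f"Q for neutron decay: {q_neutron:.3f} MeV, relative suppression: {ratio:.0f}x")
```
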

        • abakker 1863 days ago
          This is definitely one of the greatest responses I’ve seen here in the last 5 years. Thank you!
          • crdrost 1862 days ago
            Thank you for your kind words!
            • pytyper2 1855 days ago
              Just getting back to follow up on my question now, and I have to agree, this was very thoughtful. If hacker news had the equivalent of reddit gold I would give you some.
  • Maro 1863 days ago
    Quote:

    These observations have confirmed the pattern of CP violation described in the Standard Model by the so-called Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix, which characterises how quarks of different types transform into each other via weak interactions. The deep origin of the CKM matrix, and the quest for additional sources and manifestations of CP violation, are among the big open questions of particle physics. The discovery of CP violation in the D0 meson is the first evidence of this asymmetry for the charm quark, adding new elements to the exploration of these questions.

  • saagarjha 1863 days ago
    Unrelated, but

    > Precision studies of antihydrogen atoms, for example, have shown that their characteristics are identical to hydrogen atoms to beyond the billionth decimal place.

    the wording for this is a bit ambiguous, since it’s not clear if the decimal place is for the billionths or if it’s the actual billionth place to the right of the decimal point.

    • lmilcin 1863 days ago
      I guess to one billionth, because nothing can be measured to the billionth decimal place.
      • hnarn 1863 days ago
        > nothing can be measured to billionth decimal place

        surely that depends on the unit of measurement :-)

        • lmilcin 1863 days ago
          No, actually it does not. The universe we have doesn't contain anything that varies by more than a couple hundred orders of magnitude.

          Only about 10^247 instructions could have been performed by a computer since the beginning of our universe (if the entire universe were converted into a computer doing calculations in the theoretically optimal way).

          The ratio of the observable universe's volume to the Planck volume is only of the order of 10^185.

          247, 185: that's still a long way from 1,000,000,000 orders of magnitude, whichever way you look at it.

          Even if you decide your unit of measurement of volume is the volume of the entire observable universe, then the Planck volume, which is orders of magnitude less than any particle's volume, is still just 10^-185. That is just 185 decimal places, far from a billion decimal places.

          Or, in other words, if you like something more tangible: 0.000…0001, with 184 zeros between the decimal point and the final 1.

          This is a 1 at the 185th decimal place.

          If I did this for 1 billion decimal places the entire document would be a gigabyte in size and you would spend a bit of time downloading and scrolling it, assuming HN would first allow me to upload it.
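
          The ~10^185 figure can be reproduced from published values of the observable-universe radius and the Planck length; a quick sketch (both input values are approximate):

```python
import math

# Approximate values: observable-universe radius ~4.4e26 m, Planck length ~1.6e-35 m.
R_UNIVERSE = 4.4e26    # metres
L_PLANCK   = 1.616e-35  # metres

# The ratio of volumes is the cube of the ratio of lengths; take log10 to get
# the order of magnitude without overflowing floats.
orders = 3 * math.log10(R_UNIVERSE / L_PLANCK)
print(f"universe/Planck volume ratio ~ 10^{orders:.0f}")
```
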

          • repsilat 1863 days ago
            > Even if you decide your unit of measurement of volume is the volume of entire observable universe then the planck volume

            Sure, but if your unit of measurement is (for whatever reason) 10^billion times the observable universe that statement holds.

            I don't think any amount of pedantry will let us distinguish a billionth significant figure though.

          • perl4ever 1863 days ago
            Suppose that you are considering a particular one of the possible permutations of Planck volumes in the universe, of which there are about (10^185)!.

            According to Stirling's approximation, (10^185)! is more than 10^10^187, which makes a billion decimal places look very small indeed.
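
            That Stirling estimate is easy to verify numerically via the log-gamma function, since ln(n!) = ln Γ(n+1); a quick check:

```python
import math

n = 1e185
# log10(n!) via the log-gamma function: ln(n!) = lgamma(n + 1).
log10_factorial = math.lgamma(n + 1) / math.log(10)

# About 1.85e187, so (10^185)! is around 10^(1.85e187), comfortably
# more than 10^(10^187) and dwarfing a mere billion decimal places.
print(f"log10((10^185)!) ~ {log10_factorial:.3e}")
```
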

            • lmilcin 1863 days ago
              Well, you are just throwing out random numbers.

              The topic is an article saying "Precision studies of antihydrogen atoms, for example, have shown that their characteristics are identical to hydrogen atoms to beyond the billionth decimal place." This suggests the "characteristic" is some kind of physical property that can be measured and compared.

              • perl4ever 1863 days ago
                Well, we all know that "the billionth decimal place" is an error. Nevertheless, my point was that far larger numbers can be related to physical things.
      • saagarjha 1863 days ago
        Yeah, I’m sure that’s what they meant, but I still feel like it could have been worded better…
        • dhimes 1863 days ago
          Agree. Also, "billion" is ambiguous across countries. So, 10^{-9} would have been great (or is it 10^{-12}?).
          • Navarr 1863 days ago
            In which countries is 10^12 still billion?

              My quick research only turns up that it used to be this in British English

            • kgwgk 1863 days ago
              In all the blue countries in this map:

              https://en.m.wikipedia.org/wiki/Long_and_short_scales#Curren...

              “The traditional long scale is used by most Continental European countries and by most other countries whose languages derive from Continental Europe (with the notable exceptions of Albania, Greece, Romania, and Brazil).”

            • lagadu 1863 days ago
              Every non-English speaking one, in my very anecdotal and limited experience. I've never seen it used like that outside the US and sometimes UK.

              No offense meant but it's usually shown as an example of the "Americans can't count" stereotype: 1,000,000,000 is a thousand millions.

              • jerf 1863 days ago
                Both systems are broken. The British system is a bit more nominally consistent, but the names are poor; million and milliard are stupidly phonetically similar for what they are (generally we prefix those differences, probably because it's the ends of words that tend to get slurred, hence we have milli-meters and not meter-millis), and staggering the names out two chunks of 3 at a time doesn't really make much sense. It isn't very user-focused.

                The American naming system uses roots more clearly and doesn't have the weird pointless bundling together of two groups of 3, but it has a bizarre off-by-one issue... and not the usual 0->1 or 1->0 issue, but a 1->2 issue. For consistency, the order ought to be "ones, millions, billions, trillions, quadrillions", so that the prefix on the digit counter indicates the number of factors of 1000 in question, from zero, one, bi=two, tri=three, etc. However, "thousands" get stuck in there wrecking the whole thing up, so where the names say you have 1 group of 1000, you in fact have two, and so on.

                It's not an imperial vs. metric sort of thing, it's more an arguing which is the "real" temperature, Fahrenheit or Celsius, when in fact the answer is basically neither because the "real" temperature scale ought to have its 0 at absolute zero, like Kelvin [1] or the lesser-known Rankine [2], which is basically "Kelvin, except the degree is 1 degree Fahrenheit". These are both more "real" because now you can add and subtract temperatures meaningfully, which you can't do with either of Fahrenheit or Celsius. And likewise, neither number system is abstractly all that great. But then, that's part of why we have scientific notation.

                [1]: Which I just learned is about to be redefined, as of May 20th, 2019: https://en.wikipedia.org/wiki/Kelvin#2019_redefinition

                [2]: https://en.wikipedia.org/wiki/Rankine_scale

                • ajuc 1863 days ago
                  What's wrong with the long scale? It's pretty consistent. Each name is the last times 1000, each new prefix is 1 000 000 times the previous one, and each -iliard is 1000 times -ilion with same prefix.

                      milion = 10^6, miliard = 10^9
                      bilion = 10^12, biliard = 10^15
                      trylion = 10^18, tryliard = 10^21
                      kwadrylion = 10^24, kwadryliard = 10^27, ...
                  
                  Compared to that, the American way is just insane (as always).

                  But I agree that writing 1.23 * 10^6 is preferable.
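
                  The long-scale rule is regular enough to write down as code; a toy sketch using the English spellings (million, milliard, billion, ...) rather than the Polish ones above:

```python
# Latin-derived prefixes for the long scale: m-, b-, tr-, quadr-, ...
PREFIXES = ["m", "b", "tr", "quadr", "quint", "sext"]

def long_scale_name(power_of_ten):
    """English long-scale name for 10**power_of_ten (power a multiple of 3, >= 6)."""
    k, r = divmod(power_of_ten - 6, 6)
    # Each prefix covers a block of 10^6; the -illiard is 1000x the -illion.
    return PREFIXES[k] + ("illion" if r == 0 else "illiard")

# 10^6 -> million, 10^9 -> milliard, 10^12 -> billion, 10^15 -> billiard, ...
assert long_scale_name(12) == "billion"
assert long_scale_name(21) == "trilliard"
```
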

                  • jerf 1863 days ago
                    To put what I said another way, it emphasizes the wrong things. Nobody cares how many collections of 6 digits something has. 3 digits are what people care about. (Except where they care about 4, or actually vary the number of digit groupings they care about depending on where they appear, but then, this debate has no meaning to them anyhow.)

                    It's consistent, but it's consistent with something that doesn't match people's usage.

                    Both systems are roughly equally broken, so either side mocking the other for their number system is a display of parochialism above and beyond the usual levels one would see.

                    • toast0 1863 days ago
                      > Nobody cares how many collections of 6 digits something has. 3 digits are what people care about.

                      I respectfully offer the Lakh [1] (1,00,000) and the Crore [2] (1,00,00,000) for other things people care about. :)

                      [1] https://en.wikipedia.org/wiki/Lakh [2] https://en.wikipedia.org/wiki/Crore

                    • vorg 1863 days ago
                      > Nobody cares how many collections of 6 digits something has. 3 digits are what people care about. (Except where they care about 4

                      4 is a more common grouping than either 3 or 6. All of China and Japan and many places in between group their digits in fours.

                      And I always group the digits of hexadecimal numbers in fours.

                • afwaller 1863 days ago
                  This "off by one" issue, combined with the problem of words related to Latin "mille" for thousand in many languages, is why in money people often use K for units of "1,000", MM for units of "1,000,000", and B or sometimes BB for units of "1,000,000,000" (BB has no meaning as far as I'm aware, people just copy the style of MM).

                  Unfortunately it is likely too late to really adopt the metric system for financial transactions. But the K has crept in for thousands.

                • lmilcin 1863 days ago
                  > million and milliard are stupidly phonetically similar

                  Hi. I live in Poland. We use "milion" and "miliard", which sound exactly as you would say them in English, and somehow I have never met anyone who had trouble distinguishing one from the other phonetically. Soo... maybe not that stupid after all?

            • gnulinux 1863 days ago
              Most non-English languages borrow the French word "milliard" for English "billion". Some non-English languages use "trillion" for English "trillion", and some use "billion" for English "trillion".
            • AegirLeet 1863 days ago
              German:

              Million (10⁶)

              Milliarde (10⁹)

              Billion (10¹²)

              Billiarde (10¹⁵)

              etc.

            • csunbird 1863 days ago
              Very interestingly, In Turkey(Turkish) we use:

              10^6: Million

              10^9: Milliarde (instead of billion)

              10^12: Trillion

              10^15: Quadrillion

            • vbarrielle 1863 days ago
              In french "un billion" is 10^12.
  • ainar-g 1863 days ago
    99.9999% is almost 5σ, right? Have we finally discovered New Physics?

    Edit: The CERN[1] article says that it is in fact 5.3σ.

    [1]: https://home.cern/news/press-release/physics/lhcb-sees-new-f...
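
    The percentage-to-sigma conversion can be checked with the complementary error function; a quick sketch, assuming the usual particle-physics convention of quoting one-sided significances:

```python
import math

def one_sided_p_value(n_sigma):
    """One-sided tail probability of a normal distribution beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 5 sigma corresponds to a one-sided p-value of about 2.9e-7, i.e. roughly
# 99.99997%, so 99.9999% is indeed just shy of the 5-sigma threshold.
print(f"5 sigma: p = {one_sided_p_value(5):.2e}")
```
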

  • japhyr 1863 days ago
    I am unfamiliar with the concept of CP violation, and this description was really helpful:

    https://www.nevis.columbia.edu/daedalus/motiv/cp.html

  • dukwon 1863 days ago
    This is a great result and a real milestone in particle physics. CERN is really ablaze with the news (seriously, there was a fire: https://i.imgur.com/i8nkPMR.jpg)

    Stay tuned for more high-profile LHCb results at Moriond tomorrow and on Tuesday.

    • antonvs 1863 days ago
      > CERN is really ablaze with the news (seriously, there was a fire

      Will you be here all week?

  • lelf 1863 days ago
  • westurner 1863 days ago
    So, does this disprove all of supersymmetry? https://en.wikipedia.org/wiki/Supersymmetry
  • adrianN 1863 days ago
    It would be very exciting if the Standard Model can't explain this.
    • lelf 1863 days ago
      In summary, this Letter reports the first observation of a nonzero CP asymmetry in charm decays, using large samples of D⁰ → K⁻K⁺ and D⁰ → π⁻π⁺ decays collected with the LHCb detector. The result is consistent with, although at the upper end of, SM expectations, which lie in the range 10⁻⁴–10⁻³ [8–13]. Beyond the SM, the rate of CP violation could be enhanced. Unfortunately, present theoretical understanding does not allow very precise predictions to be made, due to the presence of strong-interaction effects which are difficult to compute. In the next decade, further measurements with charmed particles, along with possible theoretical improvements, will help clarify the physics picture, and establish whether this result is consistent with the SM or indicates the presence of new dynamics in the up-quark sector.

      from the paper (links in my other comment). SM = standard model.

      • gnulinux 1863 days ago
        I think a layman summary of this is "it seems consistent with Standard Model, although at the upper end of it, but we're not sure yet."
    • T-A 1863 days ago
      https://en.wikipedia.org/wiki/CP_violation#CP_violation_in_t...

      The only news here is the technical achievement of measuring the effect in charm quarks, which is harder to do than in kaons (1964) and B mesons (2001).

    • dukwon 1863 days ago
      The Standard Model explained it in the early 1970s
  • fopen64 1863 days ago
    SJWs are going to protest.
    • AndrewStephens 1863 days ago
      I don’t understand anything about quantum physics and yet somehow your comment is more bewildering than the article. WTF you on about?
      • ionwake 1863 days ago
        It was uncalled for, but as an explanation I think the premise of the comment is that asymmetry is the opposite of equality
    • sctb 1863 days ago
      Please don't keep posting unsubstantive comments like this, eventually we ban accounts that do.

      https://news.ycombinator.com/newsguidelines.html