  • Helion uses a completely different technology from the tokamaks you’re thinking of. They pulse the plasma to create brief bursts of compression, heating, and fusion. Their seventh prototype machine is already operational, so while we can’t independently verify their claims, it’s probably not all bluster.

    I have mixed feelings about their approach. They plan to use a deuterium and helium-3 fuel blend, which has a couple of major advantages. First, most of the reactions are aneutronic: the energy is released in the form of highly energetic alpha particles and protons. The lack of a high-energy neutron is a huge win for the safety and longevity of a reactor. High-energy neutrons are hard to shield against, they make most materials brittle and weak over time, they are dangerous for personnel to be around, and they can leave some materials radioactive, making reactor maintenance and disposal costly.

    The other advantage is that since all the energy is released as kinetic energy in charged particles, they don’t have to absorb high-energy photons or neutrons into a water blanket to drive a steam turbine. Instead, the kinetic energy produces an electromagnetic pulse that can be harvested by the same magnets that compress the plasma in the first place.
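
    For reference, this is the reaction in question, with standard textbook energy splits; this is my addition for context, not a figure from Helion:

    ```latex
    % Helion's primary fuel reaction: aneutronic, all energy carried by charged particles
    \mathrm{D} + {}^{3}\mathrm{He} \;\to\; {}^{4}\mathrm{He}\,(3.6\ \mathrm{MeV}) + \mathrm{p}\,(14.7\ \mathrm{MeV})
    ```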

    Sounds amazing, right? So why doesn’t everyone use this approach? Helium is rare, and helium-3 is especially rare, making up only about 20 parts per million of the helium found in geologic deposits. Simply put, it is currently infeasible to source helium-3 at scale. Helium-3 can be collected as a byproduct of breeding tritium for nuclear warheads, and enough is produced to supply some demonstration reactors, but any real demand would quickly outpace what the DOE produces.
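
    To put that scarcity in perspective, taking the 20 ppm figure at face value (and assuming it’s a mass fraction, which the number alone doesn’t specify):

    ```latex
    % Raw helium that must be processed per kilogram of helium-3 recovered
    \frac{1\ \mathrm{kg}\ {}^{3}\mathrm{He}}{20 \times 10^{-6}} = 50{,}000\ \mathrm{kg\ of\ raw\ helium}
    ```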

    Helion plans on breeding their own helium-3 in deuterium-deuterium (D-D) reactors they will operate. However, D-D reactions are not aneutronic, so all the material-lifespan, shielding, and maintenance nightmares that come with operating a conventional nuclear reactor will still apply. It also means operators will have to buy very expensive fuel from Helion indefinitely. Helion doesn’t exactly deny this drawback, but I really dislike how much they gloss over it in their public communications.
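
    For the curious, the two D-D branches (each roughly 50% likely; standard textbook values, my addition) are shown below. The neutron branch is what creates the shielding problem, and the tritium branch yields additional helium-3 as the tritium decays:

    ```latex
    % Neutronic branch: the 2.45 MeV neutron drives the shielding/activation problems
    \mathrm{D} + \mathrm{D} \;\to\; {}^{3}\mathrm{He}\,(0.82\ \mathrm{MeV}) + \mathrm{n}\,(2.45\ \mathrm{MeV})
    % Tritium branch: the tritium beta-decays to helium-3 (half-life about 12.3 years)
    \mathrm{D} + \mathrm{D} \;\to\; \mathrm{T}\,(1.01\ \mathrm{MeV}) + \mathrm{p}\,(3.02\ \mathrm{MeV}), \qquad \mathrm{T} \to {}^{3}\mathrm{He} + e^{-} + \bar{\nu}_{e}
    ```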

    Here’s a video tour of their test facilities that explains the basics of their approach. https://youtu.be/_bDXXWQxK38

    I’m inclined to think they’ve demonstrated enough results that they are likely to get a working unit built quickly. However, that would still be a long way off from creating any sort of sustainable supply chain that would be a viable option for anyone besides datacenters.


  • I’m not saying normalization is a bad strategy, just that it, like any other processing technique, comes with limitations and requires extra attention to avoid incorrect conclusions when interpreting the results.

    “Because relative to the population density, there were 100 times as many sightings. Or what am I missing?”

    If you were to attempt to trap and tag bigfoots in both areas, would you end up with 100 times as many angry people in gorilla suits in the small town? No. You would end up with one in each area. So while the tiny town technically does have 100x the sightings per capita, each region has only one observable suit wearer.

    Assuming the distribution of gorilla-suit wearers is uniform, you would expect approximately 99 tiny towns with no bigfoot sightings for every one town with a sighting. So if you were to sample random small towns because the map says bigfoots live near small towns, you would actually see fewer hairy beasts than your peer who decided to sample areas with higher population density.
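
    Here’s a toy simulation of that effect (my sketch; the rate and town sizes are invented to match the example, not real data):

    ```python
    import random

    # Toy model: every person, everywhere, reports a sighting with the same tiny
    # probability, so the true rate is perfectly uniform.
    RATE = 1 / 10_000  # one sighting per 10,000 people on average
    random.seed(42)

    def simulate_town(population: int) -> int:
        """Draw a sighting count for one town (naive binomial sample)."""
        return sum(random.random() < RATE for _ in range(population))

    tiny_towns = [simulate_town(100) for _ in range(1000)]
    cities = [simulate_town(10_000) for _ in range(1000)]

    # Most tiny towns report zero sightings, but the rare town with one sighting
    # shows a huge per-capita rate (1/100) purely by chance.
    print("tiny towns with >= 1 sighting:", sum(c > 0 for c in tiny_towns))  # ~10 of 1000
    print("cities with >= 1 sighting:", sum(c > 0 for c in cities))  # ~632 of 1000
    print("max per-capita rate, tiny towns:", max(tiny_towns) / 100)
    print("max per-capita rate, cities:", max(cities) / 10_000)
    ```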

    If we could have fractional observations, all of this would be a lot more straightforward, but the discrete nature of the subject matter makes the data inherently noisy. Interpreting data involving discrete events is a whole art and usually involves a lot of filtering.


  • Simple normalization does amplify signals in low-density areas. If a person in a tiny town of 100 reports a bigfoot sighting, and another person in an area with a population of 10,000 also reports a sighting, then with simple normalization the map shows the tiny town having 100 times as many bigfoot sightings per capita as the area with 10k people. Someone casually reading the map would erroneously conclude that the tiny town is a bigfoot hotspot, and more generally that bigfoot clearly prefers rural areas where it can hide in seclusion, when the reality is that the intense signals are artifacts of the sampling and processing methods and both areas have the same number of fursuit wearers.
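
    A minimal sketch of that arithmetic (numbers straight from the example above):

    ```python
    # One reported sighting in each place; only the populations differ.
    places = {"tiny town": (1, 100), "larger area": (1, 10_000)}  # (sightings, population)

    for name, (sightings, population) in places.items():
        print(f"{name}: {sightings / population:.4f} sightings per capita")

    # tiny town: 0.0100 sightings per capita
    # larger area: 0.0001 sightings per capita  <- the "100x hotspot" is pure arithmetic
    ```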