
If splitting atoms or fusing isotopes (fission bombs, fusion bombs) yields more energy than chemical change (TNT et al.), which in turn yields more energy than physical change (hydrogen bonds forming as water condenses), then why doesn't the trend continue BELOW the atomic level? Or do physicists have no clue about that stuff yet? I know we cannot examine things more elementary than atoms, but are there any models or theories?

Qmechanic
orokusaki
  • Well, technically, the energy gained by combining three free quarks into a proton is infinite... – Zo the Relativist Aug 17 '12 at 15:16
  • Einstein's famous $E = mc^2$ equation sets the upper bound for the amount of energy contained in a particular amount of mass - in a nuclear reaction, mass is "converted" to energy.

    The highest possible energy-yielding reaction for a particular amount of mass would be annihilation of equal amounts of matter and antimatter - this would "convert" all mass in the system into energy. So the energy contained in a system of finite mass is limited by special relativity - independent of the type of change that is occurring.

    – user758556 Aug 17 '12 at 15:19
  • Your "knowledge" about not being able to examine things smaller than atoms is incorrect. Particle accelerators do so every day. – AdamRedwine Aug 17 '12 at 15:23
  • @AdamRedwine - we don't examine them, rather their effects. If you know otherwise, I'd like to see a few photos of your favorite quarks, or perhaps you could tell me about the experiences of those who have felt them, or in any way otherwise experienced them with their 5 senses. – orokusaki Aug 17 '12 at 15:28
  • @user758556 - can you put that into an answer, and perhaps elaborate a bit more on why mass-energy equivalence (the theory) precludes the possibility of there being more mass in an object than we can observe at the atomic level? – orokusaki Aug 17 '12 at 15:40
  • (in my last comment, I'm attempting to suggest, or ask, whether $E = mc^2$ need not have anything to do with atoms specifically, meaning that the potential nuclear yield of a given portion of U-235 might not be representative of the amount of energy contained therein) – orokusaki Aug 17 '12 at 15:54
  • @orokusaki - What makes you think that observing "their effects" is different from "examining things"? When you look at something, you do not see the thing itself, you see the effect the thing has on the photons that are incident on it. Or, rather, you see the photons and your brain infers the effect the object has. Etc., etc., etc. – AdamRedwine Aug 17 '12 at 18:37
  • @AdamRedwine - Yeah, just like CERN examined faster-than-light particles, or the Higgs boson? And then... realized they didn't. You see, there is a degree of separation between anything and everything, but if the only degree of separation is a microscope and your own nervous system, one can more confidently conclude that one has "examined" something :) – orokusaki Aug 17 '12 at 23:43
  • @orokusaki, and what of the homunculus theory of sperm? http://en.wikipedia.org/wiki/Homunculus Or the vast network of canals on Mars? http://en.wikipedia.org/wiki/Martian_canal Do they sound reasonable to you? – AdamRedwine Aug 18 '12 at 02:05
  • Besides, I trust my brain far more than I trust my nervous system and I trust logic far more than I trust my brain. I had a grandfather who had several strokes. His nervous system told him that objects would float away when he looked at them. He could see them floating... should he have trusted his observation? – AdamRedwine Aug 18 '12 at 02:09
  • @orokusaki We measure the mass, intrinsic angular momentum (i.e. spin), charge, weak hypercharge, and parity of quarks, leptons, mesons and baryons. We observe them. In the case of many of them we can follow their tracks in our detectors for centimeters or meters. We observe them. Full stop. – dmckee --- ex-moderator kitten Aug 18 '12 at 08:10
  • @AdamRedwine - You trust your brain more than your nervous system? If memory serves me, your brain is kind of the core of your nervous system. – orokusaki Aug 19 '12 at 20:02
  • @dmckee - AdamRedwine - My seemingly cynical comments are intended to be thought-provoking. Science is about questioning what we know, not assuming we know everything. I defer to CERN's recent "discoveries", again, and the reason I ask for proof that can be directly witnessed by my 5 senses is that when our senses provide extraneous and/or erroneous data, we can use interpersonal communication (as a control) to rule it out as anomalous (e.g., the case of the floating objects after a stroke). How can we determine anomalous behavior in tools which rely on data without controls? – orokusaki Aug 19 '12 at 20:10
  • @orokusaki It wasn't CERN, but OPERA that reported the anomalous neutrino velocity results, and even they didn't believe it was real. If you had read their paper you would have known that. They'd been sitting on that number for six months while they tried to figure it out, and only went public when it leaked. – dmckee --- ex-moderator kitten Aug 19 '12 at 20:51
  • @dmckee - Ha, if you'd read it, you'd have known that OPERA was the name of the experiment, by CERN, not some other agency - http://public.web.cern.ch/public/en/spotlight/SpotlightCNGS-en.html – orokusaki Aug 19 '12 at 21:02
  • A few words about how particle physics is organized and works. There are labs (which is what CERN is) and there are experimental collaborations (such as OPERA). Both are funded mostly by national funding agencies. Labs provide infrastructure. Experiments do physics. Lab scientists are heavily involved with experiments, which is acknowledged by making them part of the experimental collaboration. The paper was issued by OPERA. Had it been real, the credit would have gone to OPERA. (Labs do--quite properly--take some of the credit for everything that experiments do with their infrastructure.) – dmckee --- ex-moderator kitten Aug 19 '12 at 21:48
  • And while the popular press often writes things like "Scientist at CERN reported that they had discovered the..." the only thing that demonstrates is that they do not know or care about the detailed organization of the field. – dmckee --- ex-moderator kitten Aug 19 '12 at 21:52
  • @dmckee - http://en.wikipedia.org/wiki/OPERA_experiment states that OPERA was a collaboration between CERN themselves, and 1 other agency. At this point, I'll leave you with a suggestion: Do some research before further posturing. – orokusaki Aug 20 '12 at 02:39
  • I'd like to point out that the scientific community is a wee bit more sure that the particle found at CERN was a Higgs boson than they are of the OPERA experiment. @orokusaki, does it really matter who did the experiments? – HDE 226868 Aug 18 '14 at 19:09
  • @HDE226868 If it doesn't matter who does an experiment, why don't I personally do an experiment in my kitchen sink to prove whether or not motor oil is made of oxygen and gold. I'll publish my results just for you. – orokusaki Aug 20 '14 at 14:49
  • You sound rather sarcastic. Incidentally, though, I think it would be interesting if you actually did do the experiment. In any event, I think I'll step out. – HDE 226868 Aug 20 '14 at 16:57

1 Answer


Let me first say that in terms of pure energy I concur with the points made in the comments. That is, the energy available from any amount of matter is $E = mc^2$, and thus complete annihilation is the best you can do. We have no near-term energy sources that could reach this limit, although energy extraction from black holes could. Energy from black-hole Hawking radiation would be a 100% efficient conversion of matter into energy, and would still probably be called nuclear power. To end the conversation at this point, however, would show a major lack of creativity, and I think it wouldn't do justice to the true strangeness of the universe. That said, the rest of my answer will necessarily be subjective, because the argument in terms of energy availability alone is finished.
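
To put that ceiling in numbers, here is a minimal back-of-the-envelope sketch in Python. The per-process figures are approximate textbook values I am supplying for illustration; they do not come from the question or the comments.

```python
# Rough comparison of the fraction of a fuel's rest mass that each
# process "converts" to released energy. All fractions are approximate
# textbook values, quoted for illustration only.

C = 2.998e8  # speed of light, m/s

def energy_per_kg(mass_fraction_converted):
    """Energy released per kg of fuel via E = m c^2, given the
    fraction of rest mass that ends up as released energy."""
    return mass_fraction_converted * C**2  # J/kg

processes = {
    "chemical (TNT, ~4.6 MJ/kg)":        4.6e6 / C**2,  # ~5e-11 of rest mass
    "fission (U-235, ~200 MeV/fission)": 9e-4,          # ~0.09% of rest mass
    "fusion (D-T, ~17.6 MeV/reaction)":  3.8e-3,        # ~0.38% of rest mass
    "matter-antimatter annihilation":    1.0,           # the hard ceiling
}

for name, fraction in processes.items():
    print(f"{name:38s} {energy_per_kg(fraction):.3e} J/kg")
```

Each rung of the ladder in the question corresponds to a larger converted fraction of the rest mass, and annihilation (fraction 1) is where the ladder ends.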

One could argue that energy reuse offers "more energy" in a certain sense. This is a complicated and sometimes confused argument. As a conceptual tool, let me turn your attention to concentric Dyson spheres. It has been argued (really just in science fiction) that each sphere would be able to use the waste radiation from the previous sphere. Ergo, the power output from the star, $\dot{Q}$, is used in full by each shell, so if there are $N$ shells, one may naively argue that the total power used is $N \dot{Q}$. The problem is that the more shells you have, the lower the quality of the energy each shell gets! Even if such a society uses space-age solar photovoltaic panels, it cannot violate the fundamental Carnot efficiency limit. Formally, let me claim that the useful power is less than the thermal power, which can be tied directly to Einstein's mass-energy equivalence.

$$\dot{Q}_{\text{useful}} < \left( 1 - T_C/T_H \right) \dot{Q}$$

$$\dot{Q} = \dot{m} c^2$$

$$\dot{Q}_{\text{useful}} < \left( 1 - T_C/T_H \right) \dot{m} c^2$$
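
A minimal numeric sketch of the shell argument, assuming each shell runs an ideal Carnot engine between its own input temperature and the temperature of the next shell out. The shell temperatures below are invented for illustration, not taken from any stellar model:

```python
# Sketch of the concentric Dyson shell argument. Each shell runs a
# Carnot engine between its input temperature and the temperature of
# the next shell out; waste heat is passed outward to feed that shell.

Q_star = 3.8e26  # stellar power in W (roughly solar luminosity)

# Hypothetical shell temperatures in K, hottest (innermost) first.
temps = [5800, 2900, 1450, 725, 360, 180, 90, 45, 22, 11, 3]

q_in = Q_star
total_work = 0.0
for t_hot, t_cold in zip(temps, temps[1:]):
    work = q_in * (1 - t_cold / t_hot)  # Carnot bound for this shell
    total_work += work
    q_in -= work                        # waste heat passed outward

print(f"total useful power:          {total_work:.3e} W")
print(f"fraction of stellar output:  {total_work / Q_star:.4f}")
# No matter how many shells you add, the fraction stays below 1:
# the sum telescopes to 1 - T_outermost / T_innermost.
```

Adding shells pushes the extracted fraction toward one but never past it, so the whole stack still obeys the bound above.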

The upshot is $\dot{Q}_{\text{useful}} < \dot{m} c^2$, where $\dot{m}$ is the rate at which the star converts rest mass into radiation. So even though we have changed the nature of the problem, we are still not going to do better than the theoretical nuclear limit; in other words, we haven't "beat" nuclear power. That doesn't mean there's nothing to the concept of "negawatts", but we have to change the problem yet again. If we venture into the softer sciences, energy use matters only insofar as it serves some particular useful end. Discussions about the evolution of technology often focus on computation, for instance. So let's say we're interested in floating-point operations per second: we can get only so many computations per unit of energy.

$$\mathrm{FLOPS} = \alpha \, \dot{Q}_{\text{useful}}$$

As an alternative to unlocking more energy, or to using energy more efficiently, we can improve the efficiency of the thing we're doing. A good follow-up question is whether there is a theoretical maximum to the value of $\alpha$, and I believe the answer is yes. The Heisenberg uncertainty principle requires a minimum amount of energy for any deterministic calculation; below that, the calculation becomes probabilistic, which is the concept behind quantum computers. Even if we could build quantum computers, they would still be fundamentally limited by the energy available.
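
The paragraph above argues from the uncertainty principle; a commonly cited concrete floor for irreversible computation is Landauer's bound of $kT \ln 2$ per erased bit, which I'll use here as a stand-in to show how a maximum $\alpha$ would be estimated. The temperatures chosen are arbitrary:

```python
import math

# Landauer's bound: erasing one bit costs at least k*T*ln(2) of energy.
# Used here as an illustrative ceiling on alpha (bit operations per
# joule); the true limit for reversible or quantum computation differs.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_bit_ops_per_joule(temperature_kelvin):
    """Upper bound on irreversible bit operations per joule at temperature T."""
    return 1.0 / (K_B * temperature_kelvin * math.log(2))

for T in (300, 77, 3):  # room temperature, liquid nitrogen, deep space
    print(f"T = {T:>3d} K: at most {max_bit_ops_per_joule(T):.2e} bit ops per joule")
```

Whatever the true limit turns out to be, the point stands: $\alpha$ is finite, so computation inherits the energy limits discussed above. Note also that the bound improves as the environment cools, which is one reason far-future computation arguments care about the temperature of the universe.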

There is another argument for why this $\alpha$ must have a maximum. Hopefully I can find the link and add it later, but there have been actual demonstrations showing that computer memory can be used to store information about a quantum system and make decisions based on it, causing that system to "violate" the second law of thermodynamics. The specifics are not terribly important for this answer, but the argument goes that if you could store information with infinite efficiency and do computation with infinite efficiency, you could, one way or another, become a Maxwell's demon. I find this a fairly convincing argument.
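
For concreteness, the demonstrations alluded to are, as far as I know, of the Szilard-engine type: feedback control extracts up to $kT \ln 2$ of work per measured bit, and the second law survives because erasing the measurement record costs at least as much (the Landauer/Bennett resolution). A toy ledger, with all quantities idealized:

```python
import math

# Toy bookkeeping for a Szilard-engine-style Maxwell's demon.
# Idealized numbers: each measured bit lets the demon extract at most
# k*T*ln(2) of work, and erasing that bit from memory later costs at
# least the same amount, so the net gain over a full cycle is <= 0.

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # bath temperature, K
BIT = K_B * T * math.log(2)

bits_measured = 1_000_000
work_extracted = bits_measured * BIT   # best case during feedback
erasure_cost   = bits_measured * BIT   # minimum cost to reset memory

print(f"work extracted : {work_extracted:.3e} J")
print(f"erasure cost   : {erasure_cost:.3e} J")
print(f"net gain       : {work_extracted - erasure_cost:.3e} J")
# A memory with 'infinite efficiency' would zero out the erasure line,
# which is exactly the loophole described above.
```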

This brings me to another potential "answer": any working Maxwell's demon could eventually be configured to reuse energy indefinitely. That would violate the limits I've discussed so far, and violating those limits would functionally unlock infinite energy. Infinite energy is greater than any nuclear energy resource, QED. Obviously, the problem is that Maxwell's demon doesn't exist. Still not satisfied? Maybe you're still interested in finding a loophole. You wouldn't be alone.

Frank Tipler is a real scientist, although I'm not sure whether what he writes as popular literature can be called science. I'll quote Wikipedia because, I'll be honest, even I don't understand this:

The Omega Point is a term Tipler uses to describe a cosmological state in the distant proper-time future of the universe that he maintains is required by the known physical laws. According to this cosmology, it is required for the known laws of physics to be mutually consistent that intelligent life take over all matter in the universe and eventually force its collapse. During that collapse, the computational capacity of the universe diverges to infinity and environments emulated with that computational capacity last for an infinite duration as the universe attains a solitary-point cosmological singularity – with life eventually using elementary particles to directly compute on, due to the temperature's diverging to infinity. This singularity is Tipler's Omega Point.

Aha! As per my arguments, computational capacity diverging to infinity would necessarily imply an energy source, or "method of use", that surpasses nuclear power.

In short, the 2nd law of thermodynamics will ultimately prevent anything from surpassing a perfect nuclear source of energy. But maybe, if we threw the end of the universe, or the multiverse, or something else like that into the mix, a loophole could open up an even greater resource; I'm pretty sure it would require redefining spacetime itself somehow.

Alan Rominger
  • It was OK till you started quoting Tipler. That's nonsense. Cosmic singularity, ha. –  Aug 18 '14 at 21:06