Up In Smoke
Joseph Priestley was a pioneer in chemistry who shares the credit with Antoine Lavoisier for the discovery of Oxygen. Before Oxygen was discovered — along with an explanation of its crucial role in combustion — chemists generally believed in the now long-forgotten Phlogiston Theory.
According to that theory, burning a material released its Phlogiston, so the combustion products should have weighed less than the original material. But when the smoke and ash were carefully captured and weighed, they weighed more: too much smoke! This puzzling observation led some to suggest that Phlogiston must have negative mass. Of course the extra mass that Lavoisier found in the smoke was the mass of Oxygen.
Once Lavoisier worked out the mathematics for his Oxygen Combustion Model, most chemists abandoned the Phlogiston Theory. One might say the Phlogiston Theory itself went up in smoke.
But not quite. A number of chemists — notably including Joseph Priestley — clung to the Phlogiston Theory to their dying day. Was Priestley reluctant to concede that Lavoisier had not only joined in the discovery of Oxygen, but also discovered and developed the better scientific theory of the role of Oxygen in the combustion process?
Oddly enough, the Story of Phlogiston and the Excess Smoke is very similar to the Story of Cold Fusion and the Excess Heat. The so-called "excess heat" of Cold Fusion is very much like Phlogiston — a problematic (and demonstrably incorrect) way to account for the gozintas and gozoutas, the energy inputs and outputs. In Cold Fusion experiments, some of the heat from the apparatus escapes without passing through the device that carefully measures heat flow. How do they account for this heat that sneaks past the meter? They use a complex mathematical model with a term for every imaginable way some heat can escape undetected. The smallest error in that complicated mathematical model can throw off the estimate of the unmeasured or unaccounted fraction of heat in the overall system.
Just as Antoine Lavoisier carefully captured and assayed all the smoke, modern replications of the Cold Fusion experiment are finding that the "excess heat" isn't really there. Rather, it was an artifact of subtle errors in the complicated mathematical model that purportedly accounted for the unmeasured heat that bypassed the meter.
And so Cold Fusion, like Phlogiston, goes up in smoke.
And yet the Fusioneers, like Joseph Priestley, will continue to believe in their fabulous pipe dream until their dying day.
28 Comments:
"complex mathematical model"
As a biologist, I've made use of mathematical models. My approach was to collect experimental data and then try to find the simplest set of equations that would "fit the data". The mathematical model then became a predictive model that could motivate new experiments and the collection of new data that might or might not turn out to be compatible with the model.
Among physical scientists there seems to be a tendency toward the belief that nature is fundamentally mathematical. Rather than use mathematics as a tool for aiding human understanding of the world, some folks flirt with the idea that the world is, ultimately, mathematical. This can sometimes lead to a strong faith in mathematics as a research tool.
I am reminded of Lord Kelvin, who used a set of assumptions and some mathematics to calculate a relatively young age for the Earth. We must always be careful to test our mathematical models against new types of experimental observations. The more complex the model, the more it needs to be tested.
In this case, the complicated formula had to include a term for every imaginable mundane pathway for "excess heat" to arise or vanish without being carefully metered. If some mundane pathway were not accounted for, or if the coefficients on some term were a little off, the whole model for calculating "excess heat" was thrown off kilter. Kirk Shanahan estimated that an error as small as 1-3% in estimating the fraction of heat that arose or vanished without being measured or otherwise accounted for was enough to lead the foolhardy fusioneers to erroneously conclude they had found excess heat from fusion. What they really found (but evidently didn't realize) was an error in their arcane mathematical model.
http://en.wikiversity.org/wiki/Cold_fusion/Contrary_evidence/Dash-Zhang_Replication_Effort
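To make that sensitivity concrete, here is a minimal back-of-envelope sketch in Python. The 10 W input, the 98% captured fraction, and the specific shift sizes are made-up round numbers, not values from any published run; only the 1-3% figure comes from the argument above.

# Minimal sketch of the calibration-constant-shift (CCS) argument,
# using made-up round numbers. Suppose the calorimeter captures only a
# fraction of the heat, and a calibration run determines the correction
# constant. If that captured fraction drifts slightly between the
# calibration and the live run, the corrected output no longer matches
# the input, even with zero actual excess heat.

P_in = 10.0          # input electrical power, watts (assumed)
f_cal = 0.98         # fraction of heat captured during calibration (assumed)
k = 1.0 / f_cal      # calibration constant applied to measured heat

for shift in (0.01, 0.02, 0.03):       # 1-3% drift in captured fraction
    f_run = f_cal * (1.0 + shift)      # fraction captured during live run
    P_measured = P_in * f_run          # what the sensor actually sees
    P_reported = k * P_measured        # after applying the stale constant
    print(f"{shift:.0%} shift -> apparent excess "
          f"{P_reported - P_in:+.2f} W on {P_in} W input")

# A 1-3% shift yields roughly 0.1-0.3 W of phantom "excess heat" with no
# actual excess at all: an error in the model, not energy from fusion.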
Barry, you are basically clueless. Shanahan's alleged CCS only affects, to any serious degree, some types of calorimetry, and heat/helium correlation eliminates calorimetric error as a serious factor in the fusion conclusion. Do you imagine that results showing greater than 100% excess heat, and there are plenty of them, are coming from a "calibration constant shift"? That 7000% excess heat, blatantly visible, is ... what?
The workers in the field stopped trying to prove that it's fusion maybe a decade ago; they aren't arranging their experiments for that purpose, they are attempting to explore the parameter space. The proof already exists, it's conclusive. And they -- and I -- don't care at all if you believe or don't believe. We aren't asking for your money. If you want to make yourself look like an idiot when anyone looks back at this in a few years, suit yourself. Be my guest.
But don't pretend that you understand and practice the scientific method. Your mind froze too soon, all you can do is repeat the same memes over and over.
Chorus: Over and over and over and over.
You lost me, Lomax. I don't follow your abduction and reasoning.
Abduction, unless one gets very lucky, requires a general knowledge of the topic, of the body of evidence. If you don't have that knowledge, abduction will indeed seem like a "flight of fancy." When a communication problem is due to a lack of overall knowledge, attempts to provide the knowledge in a single piece of text will seem overwhelming to the reader. However, if the reader will chunk it (take one piece that seems puzzling or unwarranted, ask about that specific piece, and explore it), the piece can be resolved and understood. However, if one just objects to the overall presentation, or picks one piece after another to attack, communication failure is guaranteed.
Barry, do you think I don't understand the skeptical arguments? Let's get that out of the way first!
Captcha: sitable. Able to sit.
Abduction doesn't require much knowledge at all. It only requires imagination.
Einstein said, "Imagination is more important than knowledge."
That's because a theorist first has to imagine a plausible model before he can propose it and then rigorously test it.
"Barry, do you think I don't understand the skeptical arguments?"
What I reckon you don't understand, Abd, is the role of technical models, of the sort that Lavoisier was capable of developing and testing.
A good scientific model embodies the five epistemic characteristics of scientific knowledge: that ideas represented in the form of models are 1) testable, 2) revisable, 3) explanatory, 4) conjectural, and 5) generative.
Your abductive flights of fancy are conjectural, but are they testable, revisable, explanatory, or generative?
Do they make testable predictions? Do they yield insight that generates better experiments, leading to ever better models with better explanatory and predictive power?
But I think I understand that model of science very well, and I'm using it.
You are correct that abduction doesn't require knowledge. But successful abduction normally requires knowledge. It is not "deduction," rather it "imagines" some theory; if the theory succeeds in organizing the held knowledge (often held outside of direct consciousness), there is an aha! moment. And then the work of testing and the rest begins.
You ask three questions at the end. The answers are Yes, Yes, and Yes. Now, are you going to abduce that I'm just bullshitting you, or are you going to follow the logic? This, indeed, is my test of my hypothesis that you aren't interested in following the scientific method as applied to your own imaginations, only in demanding that others follow it with what you think are their "flights of fancy."
You can call this hypothesis abductive, i.e., a flight of fancy, but maybe it's closer to deduction from an accumulating body of evidence. How would we know the difference?
You can demonstrate the falsity of the hypothesis by continuing with or creating counter-examples that show the persistence suggested by Feynman, the diligent effort to avoid fooling oneself.
It doesn't matter to me what course you choose; you are the one who will have to live with your own condition. Good luck.
Captcha: ouski. This is a cute pseudo-ethnic way of saying "Ouch!" How about "Ouski, Tchaikovsky"? Watch for it on a wiki near you.
You imagined that the cells used in the EarthTech replication of Zhang's cells (with a Seebeck Envelope Calorimeter), and more recently Zhang's own cells run in his new enveloping Heat Flow Calorimeter (HFC), were "dead cells" (because they did not show any "excess heat" of the sort found with the earlier style of calorimeter, which relied on the controversial Miles-Fleischmann Model to account for otherwise unmetered heat that bypassed the precision heat-flow sensor in the heat sink at the mouth of the Dewar flask).
How is your post-dictive "dead cell" hypothesis consistent with the scientific method in which hypotheses are required to make testable predictions?
"Dead" is the language used for a cell that doesn't develop the FPHE. The recent Zhang report you cite was for a cell that, in his ICCF15 presentation, he called "dead." Dead cells are common; in the ICCF presentation, he shows how many cells in the full experimental series were "dead." Of four samples, the proportion of active/total runs were 21/35, 6/7, 0/3, 0/5.
The calorimetry is showing a difference between the samples. Some produce heat under electrolysis, some don't. It's somewhat understood; ENEA is apparently good at making palladium rods that work.
"dead cell" is not a hypothesis, though, technically, it means an "apparently dead cell." it means that, if anomalous heat was generated, the calorimetry missed it. This has become typical for you, Barry, that you assume some unstated hypothesis. "Dead cell" is not explanatory, in itself. It's an observation.
The FPHE has appeared with many different kinds of calorimetry. It is often drastically higher than likely calorimetry error (much more than 5% excess energy). You seem to trust Zhang's calorimetry when it shows a dead cell, but not when the same calorimetry shows an active cell. 'Splain this thing to me!
"Dead cells are common."
Let me see if I have this straight. An experimenter runs a bunch of trials. Some come out with a positive value of "excess heat." Some come out with no excess heat (or even negative "excess heat") and these are called "dead cells."
The only way to distinguish them is to run the experiment and then, after the fact, pronounce the cell "live" or "dead" depending on whether or not there was "excess heat."
And there is no theory to explain the phenomenon of "dead cells" or to predict them in advance.
It occurs to me, Abd, that there is considerable variability and error in the measurement of so-called "excess heat," and that they are systematically declaring the cells with zero or negative "excess heat" to be "dead cells" rather than diagnosing the true explanation: they are violating Feynman's dictum to report all results, including negative results, and failing to consider that the negative results are statistically significant evidence for the null hypothesis (no effect and/or noisy measurements).
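To illustrate the statistical point, here is a small simulation sketch in Python. The run count, the zero true effect, and the noise level are all invented for the illustration:

import random
import statistics

# Sketch of the selection-bias argument: simulate runs with NO true
# excess heat, just measurement noise, then keep only the runs that
# happen to come out positive ("live" cells) and set aside the rest
# ("dead" cells). All numbers are invented for illustration.

random.seed(1)
true_excess = 0.0                      # null hypothesis: no real effect
noise_sigma = 0.5                      # watts of calorimetry noise (assumed)
runs = [random.gauss(true_excess, noise_sigma) for _ in range(100)]

live = [r for r in runs if r > 0]      # runs declared "successful"
dead = [r for r in runs if r <= 0]     # runs declared "dead"

print(f"all runs:  mean = {statistics.mean(runs):+.3f} W")   # ~0
print(f"live only: mean = {statistics.mean(live):+.3f} W")   # ~+0.4
print(f"dead runs set aside: {len(dead)} of {len(runs)}")

# Reporting only the "live" half turns pure noise into apparent excess
# heat; reporting everything, per Feynman, recovers the null result.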
It's especially significant that Zhang was running his "successful" cells at or near the boiling point, which is pretty damn steampunky, if you ask me. And those are precisely the extreme conditions under which the Miles-Fleischmann Model can be expected to become an unreliable predictor of thermal leakage.
Take a look at the results from the various palladium samples shown by Zhang at ICCF15. I'd really like to see the results for each sample plotted vs the run number. I'd like to see much more than that! But what Zhang shows is typical: samples that work once tend to work again, and samples that don't work at the beginning tend not to work later.
Your comment was radically sloppy; from what you said, I'd infer that the new calorimeter was showing different results from prior work by Zhang, or others, using different calorimetric methods. No: it might be more accurate, but it is not qualitatively different.
Second, EarthTech did not exactly replicate Zhang, except for maybe one cell, and some of Zhang's cells also showed no heat, even with good palladium. You are drawing conclusions from far too little evidence, and not attempting to falsify your own hypotheses.
"It occurs to" you that electrolytic cold fusion cells, are [i]not reliable[/i]. Next you will notice that the Pope is Catholic, or other shocking news.
Codeposition has a reputation for being reliable. Some disagree, and I, frankly, don't know.
Fortunately, we don't need to have reliability to show that it's fusion, because there is a measurable outcome that varies with true excess heat: helium.
We also don't need reliability to demonstrate, even aside from helium, that there is excess heat.
You've gone completely off your rocker with this "dead cells" term. Really, what "dead" is used for is cathodes. And all it means, really, is cathode material that doesn't show the excess heat phenomenon.
Originally, the only way to know the difference was to test the material!
I think you may have the idea that CF calorimetry is down near the noise, and that, then, positive results are being cherry-picked. That's a legitimate concern, but the best work definitely reports on all experimental cells. Notice that Zhang appears to do so in the recent work described at ICCF15. Miles certainly did that; it appears that 12/33 cells were "dead."
Significant excess heat is well above calorimetric error, unless there is some systematic error. It's been claimed that unexpected D2-O2 recombination is a source, but that is in contradiction to the substantial body of evidence that shows periods of sustained power production that are greater than D2/O2 combination could explain.
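For a sense of scale on that last point, here is a rough Python sketch of the bound. The headspace volume, the excess power level, and the duration are assumed round numbers, not figures from any specific report:

# Rough bound on how much heat unexpected D2-O2 recombination could
# supply, versus a sustained excess-power episode. All inputs are
# assumed round numbers for illustration only.

R = 8.314            # J/(mol*K), gas constant
T = 298.0            # K, room temperature
P = 101325.0         # Pa, about 1 atm
V = 1.0e-3           # m^3: assume 1 liter of free gas in the cell

n_total = P * V / (R * T)        # total moles of gas, ~0.04 mol
n_D2 = n_total * 2.0 / 3.0       # stoichiometric 2:1 electrolysis mix
dH = 294.6e3                     # J/mol, heat of D2 + 1/2 O2 -> D2O(l)

E_chem = n_D2 * dH               # maximum one-shot chemical energy
E_excess = 1.0 * 24 * 3600       # 1 W of excess power sustained for a day

print(f"recombination budget: {E_chem / 1000:.1f} kJ")    # ~8 kJ
print(f"1 W for 24 h:         {E_excess / 1000:.1f} kJ")  # 86.4 kJ

# Burning every molecule of free gas once falls an order of magnitude
# short of a watt-day, which is why sustained heat episodes are argued
# to exceed any chemical explanation.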
Do remember that these workers are generally chemists! And that they are frequently senior researchers, highly experienced.
All that "dead" means is that excess heat was not significant. I've never seen a report of significant negative excess heat. A dead cell may show negative excess heat due to normal calorimetric error. And that's what Zhang's "negative" excess heat report was, just look at the stated error limits.
"I'd really like to see the results for each sample plotted vs the run number."
Yes, we need to see all the data, not just the data for selected runs.
Here is my specific hypothesis, to explain why some runs are "successful" and some are not.
As you know, Zhang is often running his cells at or near the boiling point, so that there is a substantial amount of water vapor and condensed steam in the atmosphere. We also learn (from the EarthTech report) that there is some venting, when the pressure exceeds some safe level (e.g. 800 torr).
Venting removes some variable mix of atmospheric components — some mix of D2, O2, and D2O. Moreover — and this is crucial — the D2O is some mix of gaseous water vapor and liquid water droplets (steam or fog). When they measure the mass of materials bled off, they have to know the fractions that are D2 gas, O2 gas, D2O gas and D2O liquid phase (as fog or steam). As you know, there is a huge difference in energy between water in the liquid phase and water in the gaseous phase.
In the Miles-Fleischmann Model, the Pgas term has to account for this mix of vented steam. But how can the experimenter determine the composition of the vented steam cloud, to plug into the Miles-Fleischmann Model? If the estimated fraction of vented D2O in the vapor phase is too high (there was, in actual fact, more moisture in the liquid phase), the Miles-Fleischmann formula will over-estimate the heat bled off, crediting the vented mass with latent heat that the liquid droplets never carried. Conversely, if the estimated fraction of D2O in the vapor phase is too low (there was, in actual fact, less moisture in the liquid phase), the formula will under-estimate the heat bled off. At the extreme temperatures at which Zhang runs his "successful" outcomes, there is more steam and a greater chance of venting D2O in the liquid phase (as entrained droplets).
I frankly don't see how Zhang (or anyone else) can measure the vented steam and precisely resolve the components into the correct fractional terms to plug into the Miles-Fleischmann formula for each species in the vented steam.
Errors in accounting for the precise fraction of vented steam carrying away moisture in the liquid phase would produce corresponding errors in the computation of excess heat from the Miles-Fleischmann Model.
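To see how sensitive the bookkeeping is, here is a back-of-envelope Python sketch. The vented mass, the assumed vapor fraction, and the error sizes are all invented for illustration; only the latent-heat figure for D2O is a physical constant:

# Sensitivity of the vented-heat term to the assumed vapor/liquid split
# of the D2O carried out of the cell. The vented mass and the assumed
# vapor fractions are made-up illustrative values.

L_vap = 2.07e3       # J/g, latent heat of vaporization of D2O near 100 C

m_vented = 1.0       # grams of D2O vented during a run (assumed)
x_assumed = 0.9      # vapor fraction plugged into the Pgas term (assumed)

for x_actual in (1.0, 0.8, 0.6):     # what really left the cell
    # Heat actually carried off minus heat the model credits; only the
    # latent-heat difference between the phases matters here.
    dQ = m_vented * (x_actual - x_assumed) * L_vap
    print(f"actual vapor fraction {x_actual:.1f}: "
          f"model mis-books {dQ:+.0f} J")

# Each 10% error in the assumed vapor fraction shifts the books by
# about 0.2 kJ per gram vented, in either direction, with no nuclear
# process involved at all.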
Arggh. The software dumped a detailed comment that covered the issues raised. Summary: calorimetry error due to error in the estimation of the lost mass proportion of deuterium is limited to about 0.2% of input energy, in the extreme. That should be compared with the roughly 5% of input energy probably being shown by the Zhang cells, based on prior reports. The ICCF15 slide presentation doesn't provide the data needed to estimate the excess energy percentage. The slides are inadequate to establish overall excess energy, but Zhang does give a figure for one experiment of 2.46 kJ, and input energy is probably around 280 kJ. 1% excess energy may or may not be significant, depending on details not shown. The calorimetry error is around 0.33 kJ, he reports, for that experiment, or roughly 0.1% of input energy. So the result appears significant if his calculations were correct. Lost deuterium would presumably add to this, but only a little.
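For concreteness, the ratios in that summary work out as follows (a trivial Python check; the 280 kJ input figure is the rough estimate stated above, not a reported measurement):

excess_kj = 2.46     # Zhang's reported excess energy for one experiment
error_kj = 0.33      # Zhang's stated calorimetry error for that run
input_kj = 280.0     # rough estimate of input energy (from the text above)

print(f"excess: {excess_kj / input_kj:.2%} of input")  # ~0.9%
print(f"error:  {error_kj / input_kj:.2%} of input")   # ~0.1%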
"Steamy" is irrelevant with this calorimeter, which captures the heat from the steam. The only inaccuracy, that I can see, is from two terms mentioned: percentage of deuterium lost -- and this correction would increase excess heat, and correction from the wetting of recombiner. Earth Tech estimated 2.76 kJ for this, in their work, but that would presumably balance out when the recombiner dries during the shutdown period. Certainly this is something to be concerned about, but drawing conclusions about other cold fusion work from this is very hazardous. In closed cells, if you get recombiner failure, you get an explosion or cell failure (I'm sure that cells are now designed to vent with overpressure, and pressure is monitored.)
Absolutely, unoxidized deuterium is a variable that requires careful attention, and the work of Miles and Fleischmann should be reviewed to see how they handle this. It has not been ignored, as you have implied.
Nor is "modern work" finding no excess heat; it has been, in fact, confirming excess heat, which is well above possible calorimetry error. You've simply shown a report where that wasn't the case. You can find more of those, if you want. They are not particularly common, but given that dead cells are a very common phenomenon, there will be lots of those, still, if you look at the complete data.
Captcha: weredgie
You have the most fabulous Captcha words. This is better than the I Ching. How did you arrange this?
"Significant excess heat is well above calorimetric error, unless there is some systematic error."
A number of systematic errors have turned up. It occurs to me that the most likely systematic error is in the Pgas term of the Calorimetry Model, because that term has to account for heat lost from venting, as well as heat lost from leakage through the walls of the Dewar flask in the headspace. If I were modeling the Pgas term, I could see the need to model the mix of gases and condensed moisture in the atmosphere, since that mixture affects both thermal conductivity and heat carried away by venting. But having modeled it, I don't see any reliable way to measure that mix in real time during an experimental run.
Barry, I'm becoming less and less impressed with your ability to read and retain what is in these experimental reports. There is no Dewar flask, for example. This is an envelope calorimeter, it doesn't need one. The only correction this calorimeter needs is from unrecombined deuterium that escapes, and that is, from the mass loss, a very limited term, and it will decrease reported excess heat, not increase it. There is no substantial effect from vented moisture, as the dead cells show. Remember, those cells are also "steam baths."
The level of mass lost is well within the capacity of the outer chamber to vaporize at operating temperature, if it's all water. Take a closer look.
Look, if it were considered important, I've suggested a simple fix: an additional, heated recombiner, perhaps platinum mesh, that the vented gases from the experimental cell must pass through; the water vapor would then be condensed with a separate condenser, and would drip back into the experimental cell. But, you know what? These workers are experts. I'm not. I may be frustrated sometimes by what they don't state, but they do generally know what they are doing! You seem to assume that they don't, that the simplest ideas would not occur to them. Unlikely. They've been doing this for twenty years.
There is still a sealed containment vessel that has a safety vent for when the pressure rises.
Recall there was a correction for wetting the recombiner, and also one for water leaking out some porous tubing.
Once Zhang built an envelope calorimeter, half of his experiments showed excess heat, and half did not. You need to see the whole normal distribution for all runs, not just the "successful" half on the right side of the mean.
I'm not frustrated by what they don't say. I'm informed by it.
It occurs to me they know exactly what they are doing when they decide what not to disclose.
Barry, you are still confusing the cell itself (which vents if the pressure is high enough, which only happens from recombiner failure) with the calorimeter.
Remember, Zhang's results only represent his particular approach to exploring the FPHE. I have no investment at all in that approach working. He may or may not be getting significant excess heat; and even before I looked at the recent results, I was commenting that these were not spectacular, they were much closer to the noise than, say, McKubre's recent Energetics Technologies replications.
As to what Zhang is revealing, when you can't win an argument, attack the integrity of the researcher. Was that your intention with regard to Zhang's motives?
The way you have described the results is naive, but I'm not bothering to explain. I've limited time.
What's your theory as to why Zhang didn't publish the numbers for both halves of the distribution?
And it's not just Zhang who is cherry picking the outcomes to report.
It seems to be a recurring theme in CF research.
It could look that way to a schmuck from Boston, who gets his science from the newspaper.
Consent to action research is assumed from the ancient turnabout principle.
"It could look that way to a schmuck from Boston, who gets his science from the newspaper."
If you're referring to Milton Roe, he's from Los Angeles.
Milton Roe made some really excellent comments, and his information was merely incomplete, so, yes, he also gets his science from the newspaper; he was making blatant assumptions about the "twenty years of research" without actually reading it or reading a review of it. Quite like you, but with a better theoretical grasp of the situation. Thanks for reminding me of that discussion; there are some great points raised. I think I'll cover it and point this out to Milton.
You'll probably have a bit of trouble drumming up much interest in this subject over on W-R, but by all means, give it a shot.
Nah, I won't do it there. Bad idea. Wikiversity, probably, where something can be built and organized.
I doubt Milton Roe will agree to help you develop CF resources on Wikiversity.
But you might ask him if he agrees that you lose more body heat by evaporating a gram of sweat than by pissing away a gram of pee.
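That quip turns on the same liquid-versus-vapor enthalpy gap as the vented-steam argument above. A sketch with round numbers (the latent-heat and specific-heat values are textbook approximations):

L_vap = 2.4e3    # J/g, latent heat of vaporizing water at skin temperature
c_w = 4.18       # J/(g*K), specific heat of water

# Evaporating 1 g of sweat draws the full latent heat out of the body.
q_sweat = 1.0 * L_vap                      # ~2400 J

# 1 g of pee leaves at roughly body temperature, so relative to the
# body it carries away essentially no deficit (temperature difference
# of about 0 K between urine and core).
q_pee = 1.0 * c_w * 0.0                    # ~0 J

print(f"sweat: {q_sweat:.0f} J   pee: {q_pee:.0f} J")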