Copenhagen vs Bohm interpretation, continued
I had a discussion earlier in some thread I can no longer find; whoever participated will know. My argument was that merely accepting a scientific theory supported by empirical evidence as fact gives you no reason to make assumptions that limit you in any way. I took Bohr's interpretation of quantum states as an example of uncertainty and was confronted with the Bohm interpretation. Well, I just ran into something hilarious that I thought I should share.
I am meddling with massively parallel programming and happened to skim the modelling abilities of a Blue Gene and, by extension, Japan's Earth Simulator. There are architectural differences, but for a layman let's just call them supercomputers capable of modelling infinitely complex systems. This sounds impossible, right? Well, it is actually true in a strange way that coincides with the Copenhagen interpretation of quantum states. The key is not computing power, but something called lazy evaluation, a technique everyone in computer science is introduced to almost in the first month of the first year of study. All lazy evaluation really does is evaluate the thing (X from here on) you want evaluated only when you request to do something with it, like, for instance, observe it(!).
Until the request comes through, the value of X is present only in the rules for how it's supposed to be, not as an actual physical presence in the memory of the computer. In fact, your program might even run all the way from start to finish without ever invoking X, and thus never really having it present, yet X would still influence things around itself to some extent. You might already have realised that this is an extremely efficient way to model things, since you never keep more in memory than needed, and your model will (theoretically) never fail either, as long as rules for things are provided. Even if it failed somewhere, you would never know: with lazy evaluation a failure in the past is taken into consideration at the point of calculation. The moment you request an X that is not there, it's computed and presented to you as if it had always been there, or the computation itself changes if there was a failure, and you get a different X.
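Here is a minimal sketch of that in Haskell, a lazily evaluated language (the numbers and names are made up for illustration):

```haskell
-- 'x' is bound to a rule for producing a value, not to the value itself.
main :: IO ()
main = do
  let x = sum [1 .. 1000000 :: Integer]   -- nothing is computed here
  putStrLn "x exists only as a recipe so far"
  print x   -- the "observation": only now is the sum actually evaluated
```

Delete the final print and the sum is never computed at all, yet the program remains perfectly well defined.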
If you look at the Copenhagen interpretation and the "collapse" to a state in the moment of observation, the resemblance is striking to me. I really cannot believe that I only made the connection when I looked at the Earth Simulator, which basically spells it out. Of course, I am not the first to run into this; people have been playing with the thought of a monadic universe (o/ Haskell ppl) for a while: http://blog.sigfpe.com/2008/05/life-in-lazy-universe.html.
So deterministic or uncertain? Why not both at the same time?
At first look it appears to be related to how we conceive of things. Take the universe as a small example: steady state, big bang, cyclic. It does not have to be any of the three. Should we ever figure it out, it is likely to be a fourth thing we cannot conceive of at the moment.
We can conceive of a thing existing or not existing. We can conceive of rules for the universe, and of no rules where spirits make everything happen, like we used to. We can only (very unsatisfactorily) conceive of a thing coming into existence when a rule is applied, like opening the box and looking at the cat. We are slightly less uncomfortable with a property becoming fixed depending upon which rule is applied, position or velocity.
Perhaps instead of a cat, the decay would trigger which measurement rule to apply, so that until the box is opened both position and velocity are known, and the third rule of opening and looking causes one of those to become unknown.
When we get to Copenhagen and computers, we find that an ordered, rule-like process cannot generate random events, but it can generate X should that be required. We can view the universe as ordered or random. We lack a third view, which would be the real nature of the universe.
Perhaps some day we will develop a third view of the universe which requires neither rules nor randomness.
Or am I missing your point?
I see two huge problems with this.
1) If you used such a program to model the universe, you would need to have gravity. Gravity has range over the entire universe. So when you determine the gravitation of any one object, don't you have to "observe", and thus define, the entire universe as well? If so, you would define every object in the universe every time you evaluated any object.
2) There is still the problem of whether or not the cat is dead.
Edit: A third problem just occurred to me. If we are to evaluate and define the current state of an object upon observation, do we not also have to evaluate all of the past states of the object since it was last observed? This leads to an infinite regress that necessarily defines the entire universe up to this state.
Maybe I'm setting up a false dichotomy, but I don't see room for a third option. There either is randomness to the universe or there is not. A third option would seem to me to break the law of identity: it would be stating that the universe is random and, at the same time and in the same respect, not random.
We don't know enough about quantum mechanics to know which interpretation (if any) is correct. Our understanding of QM is through a descriptive, not prescriptive, model. For instance, between Joy Christian's model of quantum spin using Clifford algebra (which would invalidate Bell's Theorem and re-open the door for local hidden variables) and the quantum gravity model of causal dynamical triangulation, we could have a strictly deterministic world in which "uncertainty" is perceived due to the fractal structure of Planck-scale space, rather than intrinsic uncertainty.
We do have some interesting speculation along these lines, of course. There's Seth Lloyd's idea of the universe as a natural quantum computer, which might tie in nicely with lazy evaluation. I'm not sure why lazy evaluation would be needed, though, as each qubit has exactly enough computation power to calculate itself.
Until we have a better fundamental understanding of the mechanisms of the Planck-scale universe, there's little we can do but speculate. So far, we can't even do that without tripping over our own shoes (as we seem to be doing with string theory).
This is one of my pet peeves -- people drawing wild conclusions from our knowledge of quantum mechanics. QM is currently in the same state biology was in before the discovery of evolution. Sure, we can do some good research, but it's going to be very meager, and quite likely wrong; and there's no way to have a real understanding of the processes involved.
Not that I'm knocking your speculation, as you clearly presented it as speculation (and good speculation, at that). I'm just pointing out that our understanding of QM is such that it's difficult to speculate in any meaningful way.
"Yes, I seriously believe that consciousness is a product of a natural process. I find that the neuroscientists, psychologists, and philosophers who proceed from that premise are the ones who are actually making useful contributions to our understanding of the mind." - PZ Myers
This is what I was thinking last time I discussed this with ZuS. I just couldn't put it into words. Thank you for saying it for me.
Just to make sure we're on the same page though. I'm thinking the uncertainty could be an artifact of contorted space. Sound right?
Very true. I stand by my statement of having only two choices for randomness. It is either random, or it is not random. No third option there. I think we can all agree on that.
Let me digress a bit here. I am phrasing my response in terms not of fact but of human perception of fact. To quickly come up with an absurd example, let us imagine a person who is red/green colorblind in one eye, totally colorblind in the other, and alone in the universe. He sees different worlds with different eyes, but has no concept of the absence of colorblindness, nor can he imagine discriminating red from green.
It is not that I posit a third; it is that I posit something we cannot yet imagine for lack of having observed it.
I could go into some esoteric things, but before someone [Pascal? Fermat?] was commissioned to give some nobleman an advantage at gambling, chance was just chance. There was no concept of chance following rules over multiple instances -- multi as in flipping a coin many times. That concept is relatively recent, 17th century I believe.
In my "prophetic" mode as a physicist I do expect there will be some day a third mode (or fourth or fifth, at least one we have not thought of today) which will be a much better fit to the evidence.
On one claw I am saying something very trivial: we have not thought of the explanation as yet. On the other claw, at one time all of Newtonian mechanics was in the yet-to-be-conceived category. On the third claw, there are special and general relativity, which were necessitated solely by observations of extreme events whose observation was of interest only to scientists.
When I was a lad -- and expecting to wait decades to say "when I was a lad" -- the universe was steady state and matter was created out of the spreading universe. Later it became a bang-time universe where matter was created out of nothing. "Created" is what connects them: something from nothing in a universe where matter and energy are conserved, neither created nor destroyed.
I fully expect the next commonly accepted idea of where it all came from to be different from both and something which observation has not forced us to imagine.
This is deviating from ZuS's point, which is unfair to him. Wolfram, of Mathematica fame, not Wolfram & Hart, has an online book on a rule-based universe, and you can also drop $140 or so for it from the publisher. So far as I am aware there is no translation between a rule-based and a math-based description of events, but I have not dropped the money nor slogged through it online.
=====
My mention of my degree field is intended solely as a matter of introduction, not intimidation. Shitloads of similar experts disagreed with what we expect today, among them Einstein, who disagreed with certain aspects of quantum mechanics in his assertion of an underlying deterministic process.
As to 1): on the assumption the effects of gravity propagate no faster than the speed of light, there is no connection via gravity to matter that expanded in the opposite direction from us. One may guess. One may assume. But the math which describes gravity cannot be applied to matter that went in the other direction. As we know only the math, we cannot say anything about that other part which satisfies our method of understanding what we can observe.
The issue as I understand it is that the uncertainty as to the cat's mortality means it is both dead and alive, but in alternate realities. It cannot be known until one looks. Observers in both realities can look, but there is no requirement they look at the same time. Nor is there a requirement that anyone ever look.
In other words, every event to which random decay applies gives rise to an essentially infinite number of equally real realities, BECAUSE OF OUR MATH, not because of reality.
Thus our manner of describing reality is either incomplete or the universe is impossible for us to understand at this point in time.
Let me try a different approach. The following statement is either true or false:
The universe has a level of randomness to it.
It's true or it's false. It can't be kind of true, or a little bit true. It's one or the other.
Let me respond.
I have no idea what randomness is. I can only discuss its overall behavior in the case of a very large number of instances, and then ASSUME that behavior is correctly described by the math.
I only know that no deterministic process can be random, nor can a random process be deterministic; i.e., you can't drain energy from background quantum effects; i.e., Maxwell's demon is impossible.
However that is only what I know. I attempt to make what I know agree with observed facts.
That I claim to know these things does not preclude reality being best described by some concept we have yet to invent, in fact yet to conceive, whatever it might be.
Consider seriously that many of the reasons so many of us are atheists come from ideas -- views of reality -- which did not exist for most of human history. We have barely begun to understand reality.
Specific deterministic processes can match the behavior of mathematically random processes to an arbitrarily high degree of precision, as in pseudo-random number generators. Real processes are typically deterministic with a degree of randomness -- quantum-scale effects guarantee at least a minimal level of randomness. Chaotic systems are deterministic, but if they have been running for a while, they are as close to random as makes not even a theoretical difference. The most obvious example which comes to mind is the motion of molecules in a gas, which is the one where Maxwell's demon was classically invoked.
Of course, systems that have already reached a state of maximum disorder cannot be changed to a state of less disorder without the input of energy, as per the Second Law of Thermodynamics, so you cannot extract energy from a gas by Maxwell's demon. But note this is still true even if a system consisting of gas molecules bouncing off each other is assumed to be a strictly deterministic system, which in fact it is, apart from quantum-scale uncertainty.
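To illustrate how a deterministic rule can mimic randomness, here is a minimal pseudo-random number generator sketched in Haskell (the multiplier and increment are the well-known "Numerical Recipes" constants; this is an illustration, not something to use where quality matters):

```haskell
import Data.Word (Word32)

-- A linear congruential generator: fully deterministic, yet the output looks
-- random to anyone who does not know the rule and the seed. Arithmetic wraps
-- modulo 2^32 automatically because of the Word32 type.
lcg :: Word32 -> Word32
lcg s = 1664525 * s + 1013904223

-- An endless stream of pseudo-random numbers grown from one seed.
pseudoRandoms :: Word32 -> [Word32]
pseudoRandoms = iterate lcg

-- take 5 (pseudoRandoms 42) yields the same five "random" numbers every run.
```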
ZuS is certainly getting his philosophical rocks off with this one.
Yes, but... no matter how high the degree of precision, it cannot be random. Similarly, no matter how long the calculation runs, pi cannot be given exactly. I presume that with a sufficiently advanced degree in math I could give a totally incomprehensible reason why random and pi are different types of impossibilities. Not being a mathematician, I consider both the same as flying outside of the dreamworld.
Keep working on that. You might be closing in on the third way.
In all of this we do have several minor problems remaining. A disordered birthing universe at bang time results in order in the beginning and then proceeds to disorder forever after. And here is the hitch in our perception. It is hard to find any ancient concept of "religion" where chaos is not an identified enemy of us good guys. Is it not odd beyond comprehension that entropy is chaos without a moral dimension? Did our ancient ancestors have an intuitive knowledge of thermodynamics? Or are we limited in our means of expressing concepts? I opt for the latter, as the former would require a total revision of every aspect of my view of the universe. However, the universe is not predicated on saving me from the inconvenience of total revision, so I know of no way to say which it is.
Because of that I opt for a third way, statistical thermodynamics, one that considers the group behavior of the mass and velocity of very large numbers of objects. But here I run into the same problem. It is statistical thermodynamics, and replication within experimental error is the best I can do.
That gets us back to what you said, to an arbitrary degree of precision. In this case an arbitrary degree of experimental error.
And every time there is any process whatsoever, the nature of our models for understanding any of them is hobbled by the random-to-deterministic divide.
Perhaps the universe is incomprehensible. But as we are barely four centuries into getting a handle on the idea that the universe can be comprehended, I suggest declaring it incomprehensible is a bit premature.
I have no idea what the universe is like. I have found it a fascinating challenge to simply identify my presuppositions as to its nature and discard those which are without evidentiary foundation. One of the things to discard is an either/or about the universe based upon present means of conceiving it.
The Universe at the singularity is the opposite of disorderly/chaotic - it has effectively been reset to zero entropy. Even if the genesis of the Big Bang singularity arises from a random quantum event in some metaverse, that is not strictly 'chaos'.
Whatever that quibble, chaos/randomness are most definitely not our enemies, they are the ultimate source of creativity, including the origin and evolution of life. So those 'insights' about chaos are intuitively understandable and factually completely wrong. 'Ancient wisdom' is mostly crap.
Rather than a 'third way', I see the opposite - 'pure' randomness and 'perfect' determinism are two ends of a continuum.
In the real universe, as distinct from that of mathematics, uncertainty at the Planck scale means that once the precision of either our 'simulated' randomness or our calculation of a transcendental quantity reaches the Planck scale in a specific real-world application, any further precision adds nothing. The simulated randomness is indistinguishable from 'real' randomness, and any further precision in pi would be unmeasurable.
IOW, QM tells us that the real world does not allow for infinite precision. This is consistent with another QM principle: that the state of a system bounded by a finite volume is completely specifiable by a finite amount of information.
Exactly right. One very definite possibility is that the universe contains a fine structure at the Planck scale. Some models of the universe predict a "quantum foam", a substrate that forms the environment in which quantum events occur. Others (like causal dynamical triangulations) predict that the structure is completely fractal. In both models, our uncertainty is due to a universe that is not homogeneous at quantum scales. It'd be like a ball in a pachinko machine: the outcome is chaotic, but due entirely to deterministic processes.
I'm kinda leaning that way myself. As string theory has stagnated for a while, I'm losing hope that it's the correct trail. I'm really excited by the work of Lee Smolin, some of which is based strongly on work by folks like Renate Loll. (I can't read Loll, though. Her math leaves me feeling like a second-year primary student trying to understand plasma physics. Which is probably apt.)
Anyway, just like string theory sparked my interest in physics, CDT and loop quantum gravity make me wish I would've continued my study of physics instead of getting sidetracked into computers.
"Yes, I seriously believe that consciousness is a product of a natural process. I find that the neuroscientists, psychologists, and philosophers who proceed from that premise are the ones who are actually making useful contributions to our understanding of the mind." - PZ Myers
Or more aptly, the outcome seems chaotic. It would be determined by the exact position and velocity of the ball, and of course the design of the pachinko machine. If you could manage to place the ball in the same spot with the same velocity, you'd expect to get the same result.
There are advantages to computers. When programming, you more or less get to play God. Making new physical laws and interactions at a whim and such.
Since "reset" implies a periodic universe, we can't run with that. But as to early order in this universe, I was thinking of galaxies, which are as far back as Hubble has seen and, so far, a bit too early. I am confident the new cameras and the new telescope are going to keep making that very much too early.
I suspect a fourth way here (steady state, big bang, cyclical being the three): that in fact galaxies were the beginning. This digresses into my own speculation, not the subject, other than that even the anisotropy of the background radiation cannot be assigned to anything after the bang except by assumptions about the bang.
There is no such thing as "pure" random. There are only random and deterministic, with no spectrum in between. Pseudo-random means it is not random.
"Indistinguishable from" is the mindset of an engineer. Having also worked as an engineer, I have no problem at all with that. But "unmeasurable", as every engineer knows, has to carry the caveat of unmeasurable today.
And specifiable by a finite amount of information only at a specific time, a time shorter than that in which any quantum event can occur.
That is an assumption required by our method of measurement based upon our current way of looking at things. It is not something which can ever be verified.
I understand what you mean. All I'm saying is it's either random or pseudo-random. The difference may not matter to us, but it's still a difference.
By the way, Anony's reply about the speed of gravity has me curious. Does anyone know the speed of gravity? I tried looking it up but couldn't find a conclusive answer; sources range from equal to the speed of light to more than 20 billion times the speed of light.
Since no one has tackled this, I guess it is up to me. I was hoping someone would say something I had missed. Anyway, I have come across only two speculations on the propagation speed of a change in gravity, a gravity wave: the speed of light, and instantaneous. I have not come across anything else. Do you remember where you heard what you suggest?
Speed of light is simply "don't fuck with Einstein." Instantaneous is the only other answer in light of there being no basis for anything different from light. So they are just guesses. Clearly it has to propagate at some speed, else it would all back up at the source, which is an interesting concept in itself, but I don't see what to do with it.
A few years ago I was reading of a gravity wave sensor being built. I haven't heard of it since, so I don't know what is going on. Anyway, if a gravity wave is detected and a supernova appears at about the same time, then gravity propagates at the speed of light. After lots of data collection over decades to centuries, it might be possible to say they are exactly the same or slightly different. If no connection between supernovas and gravity waves is found, then all bets are off and new theory is needed. Faster, slower, instantaneous, and I guess not at all, would all be up for grabs.
The universe never asked my opinion before it decided how it had to be explained.
I think 'pseudo-random' is used, reasonably, because we know the deterministic process behind the devices we use to generate such data. When we observe a process in nature which appears purely random to our best measurements, like radioactive decay, that doesn't prove there is such a thing as 'pure' randomness in other than a conceptual or mathematical sense.
I refer to the example of the trajectories of molecules in a gas. They are effectively as random as makes no difference, yet we do not need to assume QM uncertainty to explain why this is so, even if they behave purely according to deterministic laws of dynamics. The directions in which two colliding spheres rebound change greatly with small differences in the exact point of collision, from a 'glancing' impact to a full head-on collision. This magnification of small differences in the initial velocity of each molecule with each collision, of which there can be trillions per second under typical conditions, leads to something as close as you like to pure randomness, without requiring either quantum effects or specially designed systems as in pseudo-random number generators.
That is an example of how effectively random behaviour can emerge from deterministic systems in very natural ways. 'Hidden variable' ideas speculate that analogous deterministic processes could underlie apparently 'uncaused' or random phenomena at the quantum level.
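A compact way to see that magnification effect, if you will excuse a Haskell sketch (the logistic map standing in for colliding molecules):

```haskell
-- The logistic map x' = 4x(1-x): a one-line deterministic rule that
-- magnifies tiny differences just as molecular collisions do.
logistic :: Double -> Double
logistic x = 4 * x * (1 - x)

-- Two trajectories starting 1e-10 apart diverge completely within a few
-- dozen iterations; printing the pairs side by side makes that visible.
main :: IO ()
main = mapM_ print (take 40 (zip (iterate logistic 0.3)
                                 (iterate logistic (0.3 + 1e-10))))
```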
About the speed of gravity: the ideal evidence for light-speed propagation would be direct detection of gravity waves. Some of the best indirect evidence, I think, comes from measurements of the decay of the orbits of massive objects like neutron stars orbiting around each other, which I gather is consistent with loss of energy due to the generation of gravity waves.
I'm not going to speculate about hidden variables, as I am not really qualified to; I don't have the knowledge required to make any sort of educated case for or against. I will say, however, that this is exactly the point I was making to Anony: "seems random" is not the same as "is random", even if the two may be indistinguishable.
I was actually just about to respond to Anony with a similar post on the point of gravity propagating at a finite speed. It is necessarily so to explain the loss of energy seen in orbiting bodies.
On the subject of aberration in light-speed gravity waves and its effects on the stability of orbits... I've seen a fair amount of evidence that the stability of an orbit can be maintained by a sort of balancing force, much in the same way as electromagnetic orbits, even with the vast difference in distance. That is to say, the GR model can produce accurate orbits that account for gravity waves referring to retarded positions. This doesn't prove that the waves propagate at the speed of light, but it definitely lends support.
It didn't ask anybody. I told it how to behave. I just can't remember what I said...
On a serious note. There is possible evidence for the speed of gravity and light being the same. I'm not aware of any in favor of instantaneous gravity, and there is plenty of evidence to the contrary. I think we'll just have to wait until we can measure it directly.
Actually, no. What you are talking about is merely evidence that there are gravity waves. It has nothing to do with how fast they propagate.
Let me digress. A gravity "wave" merely refers to a change in gravity in one place affecting the masses around it. Consider a bullet, from time of firing to point of impact: that is a "wave" in the sense of this discussion. It has nothing to do with ripples in a pond or anything like that.
As you are just talking about the existence of gravity waves, there is no way to determine their speed from their existence. We have to actually detect a gravity wave and note its time of arrival before we can begin to talk about speed.
There are two obvious possibilities here. The hoped-for and expected one is that the detection of a gravity wave will coincide with a visual (speed of light) astronomical event such as a supernova. The undesired outcome would be no correlation with visual events. If a gravity wave leads or lags an optical event within a reasonable period of time, then a speed can be determined. Maybe a photon is the only thing which can come close to the speed of gravity, and therefore we need only put a slightly larger number into Einstein's equations and nothing significant changes. Ditto if gravity is the only thing that can come close to the speed of light. Either difference would be interesting and have huge ramifications, but only physicists are likely to think it of interest.
Another possibility is that no matter how sensitive the detectors, no gravity waves can be detected. That will require an entirely new physics to explain.
Yet another possibility is that waves are detected with no connection to optical events. That will require a major investment in sensors and data collection before any sense can be made of it. For example, no connection could mean instantaneous, so the optical cause may be thousands or more years in the future. Or gravity's speed is such a small fraction of light's that it might take a mere century to connect a wave to its optical event, but not in our lifetimes.
Chum the waters and see what bites.
A wave is entirely different from a moving solid object such as a bullet.
It is the propagation of changes in some medium or field, where the medium or field does not actually travel with it in any sense. What is traveling is energy, which has a mass equivalent and so is explicitly limited to the speed of light, just like a photon, which has zero rest mass and so is effectively just propagating energy.
The rate of energy flow is closely related to the intensity of the changes and the velocity of propagation. The observations of the rate of energy loss from binary neutron star systems are consistent with the masses involved and their motions, using the model of gravity waves consistent with General Relativity and light-speed propagation.
I am pretty sure that the idea that gravity changes propagate at light speed is directly connected with the observation that the orbit of Mercury is not quite consistent with Newtonian gravitational theory, but is successfully explained by General Relativity. EDIT: Which was one of the earliest successes of GR.
As usual you are spot on. The reason I say "possible" evidence is because the experiments using the neutron stars have been contested as merely proving the speed of light. I don't know enough about astronomy or physics to say either way.
Edit:
Anony, current observation puts gravity at a finite speed. What that speed is, is the question now. Whatever the speed may be, it is not instantaneous.
Here's how the computer does it:
How much evaluation you do depends on how precise you need to be. It's a tradeoff against the power of the machine. However, internally, the power of the machine is not a limit! While you might not be able to replicate something outside of your model, you can still do a heck of a job, and the inside of it will never fail.
This results in the model being a 100% precise replica of the universe, if the model is all that exists from your perspective. I realise this is the catch-22 of lack of falsifiability, but there you have it. I realise this might not be entirely satisfying for you, because it's not for me either.
YES - which is what is so awesome. You have a perfectly deterministic world in which the state of the cat is actually not evaluated yet - unless you count the cat as a key observer, in which case you need to evaluate at least some of what it perceives.
It depends - some or many states and factors that figure in might be evaluated already, and you would never need to go back past what is already evaluated. It depends on who the key observer is. Again, it does not have to be the case that we are the ones observing; we might be activated in some way as a side effect of the real observer, like working memory is activated in a computer even though it is not the point of the power-up. But yes, if you need everything evaluated, it will be - flawlessly.
What I think is the great benefit of looking at it this way is:
1) Relativity of time is not weird - time is not an issue in lazy evaluation. It actually counts on time being distorted, and this is perfectly intuitively understandable. The past is now; the future is deterministic and deterministically unknown.
2) Infinity is not an issue - you just evaluate what you need of reality, and there can always be more. We actually have infinite lists of things in programming languages, and even in our own speech.
3) To use a gaming term, you can NEVER "fall through the floor" upon entering a room, since you do not enter until the room is done evaluating. (See the sketch below.)
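Here is a toy Haskell sketch of point 3 (all names are made up for illustration; the arithmetic just stands in for expensive world-building):

```haskell
-- An "endless" dungeon in which each room is built only on entry. 'world'
-- is an infinite list, so room n exists only as a recipe until demanded;
-- there is no moment at which you can arrive before the room is finished.
data Room = Room { roomId :: Int, loot :: Integer } deriving Show

world :: [Room]
world = [ Room n (expensiveLoot n) | n <- [0 ..] ]
  where expensiveLoot n = sum [1 .. fromIntegral n * 1000]  -- stand-in for heavy work

enter :: Int -> Room
enter n = world !! n   -- walks the list out to room n, building only what it passes

main :: IO ()
main = print (enter 7)   -- room 7 (and its loot) is evaluated here, on demand
```

Rooms 8, 9, 10 and beyond remain unevaluated recipes forever unless something enters them.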
I get what you're saying now. As long as precision isn't a big deal, you can be very sloppy in your calculations. It doesn't matter if a star is three hundred miles from its actual location; we would never know by looking at it from Earth. So things only have to be precise enough to put them in the right pixels. Is that the gist of it?
I'm not aware of any infinite lists in programming. Could you elaborate on that?
I hate blue hell. It can easily ruin an otherwise awesome game.
When modelling stuff - yes. But once the model is made, internal to the simulation the computation does not fuck around - every calculation is exactly according to prescription.
BUT - internal to the system, you don't have to have calculated things to assume they are there. Let's talk about a computer model of a cake.
If you have a prescription for a cake, you don't really need to make it (create its full specification) to assume it's there in your calculation. You can work with its visual representation correctly without ever calculating its mass, smell or consistency thoroughly. So you could be looking at a cake, then go to it, and your observation of its taste and touch would fit the earlier observation just fine. Add more calculated information on how it feels to be poisoned by arsenic, if your wife made the cake after finding out you cheated on her.
This is the way functional programming uses mathematical functions and need-to-know lazy evaluation to calculate what's going on, in extreme layman's terms. Time doesn't matter; past and present are both here in the calculation.
So what I am saying about gravity is that it is calculated precisely, but taking only some variables into account. It could be instantaneous as well, since it's just a computation and we have predefined steps (quanta of time) - we define how it works. If you lived in a simulation, this would be the 100% truth - the laws of nature.
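A hedged sketch of the cake in Haskell (hypothetical names; the 'replicate' is just a stand-in for an expensive computation):

```haskell
-- The cake as a record of lazy fields: looking at it forces only
-- 'appearance'; 'taste' stays an unevaluated recipe until someone eats.
data Cake = Cake { appearance :: String, taste :: String }

wifesCake :: Cake
wifesCake = Cake
  { appearance = "three innocent-looking chocolate layers"
  , taste      = simulateTaste
  }
  where simulateTaste = last (replicate 1000000 "bittersweet")  -- expensive thunk

main :: IO ()
main = do
  putStrLn (appearance wifesCake)  -- cheap: the taste is never computed
  putStrLn (taste wifesCake)       -- only now does the taste simulation run
```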
This is almost like a religion in computer science - functional programming based on the lambda calculus. All the major imperative languages have adopted some form of lazily evaluated infinite lists, because they are just awesome.
An infinite list needs the following:
1) a way to prescribe its elements
2) a way to dig into the past and the future in order to get the ingredients needed for the prescription
3) a way to avoid instantiating the whole list at once, just as you want to be able to increment numbers toward infinity without having to do the whole infinity every time you increment a number
If we depended on actual instantiated structures to tell us what state our list is in, it would be very hard to keep all the needed information around. Even if we could, the information could be corrupted with time, and our list would be useless. You can intuitively see that functional languages are better at this, since they keep no states or mutable structures around. They are basically the prescriptive functions: f(x) is always just f(x) for any x, and if your x is also a function, you have zero liability. If everything is a prescription that just represents the cake when needed, you can always have the cake just right, and since it's a prescription, it can go as deep into describing the cake as necessary - to infinity.
This is the rough outline; I have a bunch of articles you might want to read, though.
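One classic concrete example, in Haskell:

```haskell
-- An infinite list meeting all three requirements: the prescription defines
-- every element, each element digs into the "past" (the two previous ones),
-- and nothing is instantiated until something demands it.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

-- take 10 fibs  ==>  [0,1,1,2,3,5,8,13,21,34]
-- fibs !! 100 forces only the first 101 cells, never "the whole list".
```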
Energy is NOT traveling. Energy per se does not exist. Things which have the dimensions of energy exist. In the case of a bullet, kinetic energy.
If you don't mind my saying, the zero rest mass of a photon is like squaring the circle. A photon cannot have zero velocity, so zero rest mass is like a squared circle: it cannot exist, by (almost) definition.
Rather, the assumption of zero rest mass is what makes the equations make sense. If I assume something else, you have every right to ask me to demonstrate it. But I also have the pre-existing right to require a demonstration that a stationary photon has no mass.
I would not pretend to guess; this is one of those things like the precession of the orbit of Mercury. I once made the mistake of researching it in my third year of college. The math of the parts which were known from Newtonian mechanics was so far beyond me that I could get nowhere near the relativistic part. So it will also be with neutron star binaries, but worse: consider the contribution of mass transfer between them, their supposed egg shape, and a host of other things.
Possibly, though it is way too long ago to recall that research. I do not think propagation velocity would matter, as the mass of the sun was a constant -- with no way to measure it if it were not -- and the perturbation is constant, not related to a changing mass. Of course I could be completely wrong.
As for the edit, it was set up as THE test itself of GR, and by implication SR. After that only a few diehards looked for alternate explanations. I forget the name, but I read his work in the 60s; then around the late 70s he published his intention to give up, as he had exhausted all the alternatives he could think of and none had worked out. I would point out that his intent was to find something simpler which would give the same results. I remember very little, but one probably incorrect memory is that one of the things he explored was an invariant scalar. Beyond that I have forgotten just about everything I ever knew about GR.
I'm not sure what you're getting at. In what way are these lists infinite? Are you saying that they are explicit and thus any arbitrary value can be passed into them? What you're describing sounds recursive to me.
I rather think these are just lists which can be arbitrarily large in principle, but in practice can be populated with actual data only to the limits of available (including virtual) memory. You could even assign each actual element an index which could be quite large, as though indexing an actual contiguous list, even though only the elements actually referenced need be instantiated (a 'sparse' array). Normally there would be an actual finite limit to the index, maybe 32-, 64- or 128-bit.
You could use a fancier integer format which allowed the number of bits to be as large as desired, but of course never actually infinite.
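A minimal sketch of such a sparse structure in Haskell, using the standard Data.Map container (the names are made up for illustration):

```haskell
import qualified Data.Map as Map

-- A 'sparse' array: indices may be huge, but only referenced slots exist.
type Sparse a = Map.Map Integer a

insertAt :: Integer -> a -> Sparse a -> Sparse a
insertAt = Map.insert

lookupAt :: Integer -> a -> Sparse a -> a
lookupAt i def m = Map.findWithDefault def i m   -- default for untouched slots

-- Only two entries are actually stored, though the indices span 10^18:
demo :: Sparse String
demo = insertAt (10 ^ 18) "far away" (insertAt 3 "near" Map.empty)
```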
In floating point formats, numbers can be much larger, but of finite precision, and there can be explicit codes for infinity (eg INF and -INF), which will be returned as the result of a non-zero number divided by zero. These can still be used in further calculation, eg 1/INF => 0 and 3 * INF => INF, and in fact in any function which approaches a defined value as its input approaches infinity, whereas in simpler representations, once you get a divide by zero, you 'crash'.
There will also be a code for 'undefined' numbers (NaN), such as the result of zero divided by zero.
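Those rules can be checked directly; in Haskell's IEEE-754 Double, for example:

```haskell
-- IEEE-754 special values behave exactly as described above.
inf, minusInf, nan :: Double
inf      = 1 / 0    -- INF: non-zero divided by zero, no crash
minusInf = -1 / 0   -- -INF
nan      = 0 / 0    -- the 'undefined' code (NaN)

-- In a REPL:
--   1 / inf    ==>  0.0
--   3 * inf    ==>  Infinity
--   isNaN nan  ==>  True
```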
I totally agree. When he started talking about "recipes" I thought maybe he meant some sort of constructor that would make a new object when needed, to be discarded as soon as it was not needed. Maybe I'm not on the right path, but I imagined some sort of classification tree that would have to be traversed and appended accordingly to define any object in the model.
Obviously we aren't really going to entertain the idea of infinite sized lists as a possibility.
We are not going to entertain the idea; we are going to implement it in reality.
The recipe is just the prescription of how the list is to proceed if it needs to expand beyond its current actual size. The infinity is not that it can be infinitely large, although that is a consequence (as long as there is space to expand into). The infinity is that it can expand indefinitely - and will always be correct.
It is a prescription not based on state or any current data, so corruption of data will not make it less correct. Lazy evaluation makes it possible to "reach back and forward" as far as needed to calculate the history of what needs to be known right now, effectively eliminating time as an issue: there is no need to keep up with progressing states, which can be corrupted, and no need to calculate everything at the same time to have an apparently concurrent simulation. Finally, the prescription and lazy evaluation together mean the list never has to be instantiated in full at any point, so the infinite size is not a problem - internal to the system, infinity is a fact.
The thing to appreciate is that you don't need to have a list fully populated in order to calculate everything around it as if it were fully populated, and to do so correctly. As for "putting different things into the list": that a list has to be of a certain type is entirely for human purposes. Most functional languages are strongly typed but allow for creation of types on the fly. Imperative languages have been adopting this lately, like the way .NET groups data from in-code database queries - you don't know what type it is, and the type really doesn't have a name either.
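A tiny illustration of a prescription with no state behind it (standard Haskell):

```haskell
-- The naturals by pure prescription: no stored state, nothing to corrupt.
-- The list "expands" exactly as far as any demand walks out into it.
nats :: [Integer]
nats = 0 : map (+ 1) nats

-- nats !! 1000  ==>  1000, with cells created only as the demand reaches them;
-- nothing was ever populated in advance, yet every answer is always correct.
```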
That is precisely why we cannot have infinitely large lists. The way we store information requires a storage medium to hold the data. In order to store an infinite amount of information, we would have to have access to an infinite amount of matter/energy. Even if such lists were possible in reality, they are not possible to program.
I think "arbitrarily large", or "without explicit limit", would be more accurate way to describe these lists, rather than "infinite".
Which order of transfinite number (in the Cantorian sense) did you have in mind? There are an infinite number of different infinities, actually.
And where the hell does 'data corruption' potentially come from in the context of a computer algorithm??
Agreed.
I was thinking of the set of all integers greater than or equal to zero. Countable but not finite.
The problem is not data corruption, it's memory referencing. How do you request the information at a transfinite location? The DWORD value is going to have to be finite, and the number of DWORDs used to reference the memory will have to be finite. As such, you will always have an upper bound on the memory you can reference. Even if it is arbitrarily large, it's still not infinite.
Internal to the system, for all intents and purposes, the list is infinite. If you ran out of storage, from outside the system it would seem that the system had stopped, but the computation would be completely content just waiting for more storage, since time INSIDE THE SYSTEM is totally defined by the progress of the algorithm. A second in our reality could be aeons outside of it; that would be no problem, since we experience only the system. We could even be standing completely still compared to the outside of the system - not an issue at all.
So much for infinity. Now I will try to make laziness a bit clearer. There are two ways to calculate everything:
1) provide a recipe for everything with no states, or
2) provide a recipe for creation of states and a recipe for what to do with existing states depending on the information in the states.
The first is obviously implementable using lazy evaluation. The second, not so much: if you want lazy with states, you have to keep track of an ever-growing number of states. It is also unclear how you would calculate interdependent things, like gravity, since you only have prescriptions that work on instantiated states; you would have to have all the states ever made to be able to calculate the history of any given state. States are sort of like prices in the "free market": all the information about the product is summed up in one state, and it is completely impossible to calculate the history of the creation of the price from the price itself - no prescription will help you without knowing ALL the factors (states) that ever had influence on the price. On the other hand, if gravity is just another prescription with access to all the other prescriptions, you don't really need to calculate all of them to pass them through as arguments to the functions we need calculated now - the gravity function is just a constraint, an argument the calculation has to obey. The moment it needs exact information about something, it calculates the history of the needed aspect, still obeying the constraints and making sure it will fit perfectly with the rest of the program.
Granted, functional languages are much more processing-intensive and tend to run slower than imperative languages with states, so I usually ignored them completely in my work. The theory is also somewhat counter-intuitive (I dare you to study category theory). But they do not demand much space, they are not susceptible to corruption of states (remember, with states we do NOT have prescriptions for everything; we depend on the states being correct, so data corruption is a catastrophe), they implement laziness easily, and you will never enter the blue room that just didn't get rendered for lack of memory. Granted, the machine might stall, but it will be an internally correct calculation that just happens to look very still from our perspective.
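Here is a hedged toy contrast between the two ways, in Haskell (a made-up falling-body example; g = 9.81 and the step size are arbitrary choices):

```haskell
-- (1) Stateless prescription: the whole history is one function of time.
--     Any instant can be demanded directly, in any order, with no stored past.
position :: Double -> Double
position t = 0.5 * 9.81 * t * t

-- (2) Stateful stepping: each state is built from the previous one, so asking
--     about time t means replaying (or having kept) every step before it.
step :: (Double, Double) -> (Double, Double)   -- (position, velocity)
step (x, v) = (x + v * dt, v + 9.81 * dt)  where dt = 0.01

positionStepped :: Double -> Double
positionStepped t = fst (iterate step (0, 0) !! round (t / 0.01))
```

Corrupt one intermediate pair in the second version and everything after it is wrong; in the first version there is nothing to corrupt.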