BrainDen.com - Brain Teasers

Guest
Question

This is a spin-off from rookie1ja's Lazy-Bones Paradox (if there is a destiny, why bother going to the doctor's when ill?). A belief in destiny may lead to bad decision making, but that's not to say the belief itself is incorrect.

I'd like to get some thoughts on the Destiny vs Free Will subject, but first let's get a few things out of the way:

The question of destiny doesn't depend on some quasi-religious notion of a "master plan". Destiny may simply exist without anyone knowing what the "plan" is, perhaps just as a consequence of physics. If the current state of the universe and the laws of physics acting upon it dictate all that happens, then this determines the future regardless of whether we can predict it. In my opinion destiny simply requires there to be just one possible future.

Clarification of "possible": "Possible" is often taken to mean "something we do not know to be untrue (or impossible)". If I bought a ticket for last night's lottery but haven't checked the results yet, and you ask me "Did you win the lottery?", I might answer "It's possible, I don't know yet". In reality the outcome is already determined, so my winning the lottery is only possible if it actually happened. I either won or I didn't; I just don't know which, so I used the word "possible" to indicate a lack of knowledge in this case. But that's not what I mean when I say "one possible future". I mean only one future which may happen (regardless of knowledge).

Picture a hypothetical observer standing outside of time. Would they see time as a line, as a single sequence of events from the distant past to the distant future? If so, however unpredictable the future may be, destiny is a reality. In this case, the notion of "free will" may be a useful one, but it is an illusion (caused by our inability to keep track of the underlying mechanics, the cause and effect which dictates our every thought). You might say that those who believe in destiny and make bad decisions because of it were destined to do so, and those who believe in free will and make good decisions because of it were equally destined to do so.

You might argue that Free Will can exist alongside Destiny. Consider this example:

You've been kidnapped and locked in a room with a red door and a green one. You are told "You have the freedom to leave the room by whichever door you choose, and accept the consequences". So you choose (say) the green door, which leads to a reward and an exit. Later you find out that the red door was a fake door with just a wall behind it. The maker of this room (having studied the way you think in infinite detail) knew that you were certain to choose the green door and therefore didn't bother building a second exit. It's true that you had "the freedom to leave the room by whichever door you choose", since you would only ever have chosen the green door, regardless of how "free" you thought your choice was. Freedom doesn't necessarily mean that there is more than one possible outcome.

For the purposes of this debate, however, I would like to define "Free Will" as the ability to make more than one possible choice. Which makes it utterly incompatible with Destiny.

So it's a fight to the death. And I propose that the deciding factor is whether or not we have more than one possible future.

Let battle commence!


Not only would that happen, but it would happen an infinity of times. That's what I've always struggled with: how do you measure probability with infinities? Some probabilities are inf/inf but "infinitely likely", while others are inf/inf but only "infinitesimally likely". Is there a formulaic way to use limits to take probabilities to infinity, or are we just BSing our way through that stuff?

Could it perhaps have to do with the cardinality* of the sets?

Maybe if the sets have the same cardinality, or the cardinality of the denominator is less than the cardinality of the numerator, it will be "infinitely likely," but if the cardinality of the denominator is higher than the numerator's, then it will be "infinitesimally likely"? I'm just making this up as I go along, so I have no idea whether set theory could even be applied in this situation.

* It's kind of long-winded and some of it isn't stated that well, but this link talks about cardinality and infinite sets for anyone interested.

* It's kind of long-winded and some of it isn't stated that well, but this link talks about cardinality and infinite sets for anyone interested.
I thought it was pretty concise considering the material covered. The proof that real numbers are uncountable is definitely not trivial and it was a very short and clear explanation of that (though I'd call it a proof by contradiction personally).

The question of probability is one that really does my nut in too.

Take a really simple example. Picking a positive integer at random from the set of all positive integers (never mind how), what's the probability it is divisible by 10?

1/10, right? But how do we know? It seems pretty obvious because of the distribution of multiples of 10 within integers. If you take a large interval the multiples of 10 will always be about 1/10th of the integers in that interval, the larger the interval the less variation there is to that proportion.

But you could re-order the set like this:

0,1,10,2,20,3,30,4,40,5,50,6,60,7,70,8,80,9,90,11,100,12,110,13,120...

It's not an elegant example but it will do. The odd-numbered items are multiples of 10 in order, the even-numbered items are non-multiples of 10 in order.

In this set every second item is a multiple of 10. An arbitrarily large interval of this set would consist of 50% multiples of 10. Surely then, the probability of picking a multiple of 10 from this set is 1/2. But it's exactly the same set!
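The two densities can be checked numerically: over longer and longer initial segments, the natural ordering settles at 1/10 while the interleaved ordering gives exactly 1/2. A quick Python sketch (the generator and function names are just my own labels, and the interleaving here starts from 1 rather than 0):

```python
from itertools import count, islice

def natural_order():
    """Positive integers in the usual order: 1, 2, 3, ..."""
    return count(1)

def interleaved():
    """The re-ordering from the post: non-multiples of 10 interleaved
    with multiples of 10, so every second term is a multiple of 10."""
    non_multiples = (n for n in count(1) if n % 10 != 0)
    multiples = (10 * k for k in count(1))
    while True:
        yield next(non_multiples)
        yield next(multiples)

def fraction_multiples_of_10(seq, k):
    """Proportion of multiples of 10 among the first k terms of seq."""
    return sum(1 for n in islice(seq, k) if n % 10 == 0) / k

print(fraction_multiples_of_10(natural_order(), 100_000))  # 0.1
print(fraction_multiples_of_10(interleaved(), 100_000))    # 0.5
```

Same set of numbers, two orderings, two different limiting proportions.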

But where this really gets me confused is when you start wondering about whether, given infinite varieties of universes, there is a probabilistic justification for Occam's Razor.

I can't help but feel that, where I said "never mind how" earlier, I was skipping over something important.


I can't help but feel that, where I said "never mind how" earlier, I was skipping over something important.

I think you're right. I think it's impossible to "pick" a random number from an infinite distribution. How would you do it? In math a lot of times if you do an impossible operation (like dividing by zero) you end up with conflicting results... simple example:

x = 0, so x + x = 0

x = 2x

Divide both sides by x:

1 = 2

Of course, this is wrong because dividing by zero was an illegal operation. Better versions of this disguise it better and divide by x somewhere in the middle after you've forgotten that x=0, but the idea is the same.

The analogue of that might be what you were saying, somehow selecting a single integer out of infinity.

What is Occam's Razor exactly? I've never heard of it.
http://en.wikipedia....ki/Occams_razor

Maybe that doesn't encapsulate entirely what I meant anyway, see later...

The analogue of that might be what you were saying, somehow selecting a single integer out of infinity.
Certainly the how of it presents a problem. It's hard to think of a case where you'd be selecting a random outcome from infinitely many. Even in the quantum RNG example there are finitely many distinguishable outcomes to any given test. So I'll get back to Occam's Razor and multiple universes. Let's suppose that our universe is a possible physical system which is but one of infinitely many, as many as there are valid mathematical models for such a system. Some such systems will be simple and elegant. Others ridiculously overcomplicated. Some will support life. Others will not.

We know that our universe is one of those complex enough to have embedded self-aware life forms. What can we infer, probabilistically, about the sort of universe that we have? Is it possible to conclude that our universe is likely to be only just as complex as it needs to be in order for self-aware life to exist? For every such universe, you could conceive of an infinite variety of unnecessarily complex variations that would do the same thing. So have we any reason to believe that physics is not unnecessarily complex?

This is a true "pick one out of infinity" type problem, and no doubt presents some very slippery pitfalls of reasoning.

But it would tend to go one of two ways; either our universe is overwhelmingly likely to be only just as complex as needs be for self-aware life, or it is likely to be overcomplicated, not just a little, but to an insane degree such that physics has absolutely no hope of ever being fully understood. For example, we perceive four dimensions (counting time), but various physical models postulate the existence of more dimensions, which are curled up so small that we don't notice them. A model is generally thought preferable if it postulates fewer dimensions. However, if what I'm suggesting is right, we have every reason to think that really there are ridiculously many completely unnecessary dimensions.

As I hinted at earlier, this might not really be strictly to do with Occam's razor. For any given complication, the probability may be that it is wrong unless it is necessary, and yet the universe as a whole may still probably be overcomplicated.


I think I get it. The little blurb about opposing heliocentric theories helped (towards the side).

But if it's true, then why does physics always seem so complex? Or am I really not getting it?

Edit: Of course, this is coming from someone who will only be taking basic physics in school about 3-4 years from now, so... =)

Edited by gvg

First I want to say that this subject of probability and randomness involving infinite sets is quite hard to grasp. 1/10 or 50%? Of course it's 1/10, but explaining why it's not 50% is tougher than one would think.

Is it possible to conclude that our universe is likely to be only just as complex as it needs to be in order for self-aware life to exist? For every such universe, you could conceive of an infinite variety of unnecessarily complex variations that would do the same thing.

But it would tend to go one of two ways; either our universe is overwhelmingly likely to be only just as complex as needs be for self aware life, or it is likely to be overcomplicated, not just a little, but to an insane degree such that physics has absolutely no hope of ever being fully understood.

Why these two ways? I require some clarification on your thinking:

If we can conceive of an infinite variety of unnecessarily complex variations that would do the same thing as a simpler universe that just barely allows self-aware life to exist, then to me that implies you're saying that the "simpler" universe is extremely rare and the overly-complex universes are more common. From this we would be tempted to say that it is likely that our laws of physics are extremely complicated.

So then you mention "two ways." Regarding the first way, why would our universe be overwhelmingly likely to be only just as complex as it needs to be to allow for self-aware life? I would guess that this would be because a very large percentage of the mathematically possible universes that you said exist in this hypothetical are only barely complex enough to contain self-aware life. But, given that we're assuming that infinitely many universes exist ("as many as there are valid mathematical models for such a system"), then I think, as you said, there are a great portion of universes that are more complicated than the simplest universe that is still complex enough to contain self-aware life. If this is true, then I don't think that this "way" is likely at all. Rather, the other "way" you mentioned, that our universe is very likely overly-complex seems better.

This is a true "pick one out of infinity" type problem, and no doubt presents some very slippery pitfalls of reasoning.

Undoubtedly, but wouldn't it be great if we could figure out the reasoning and draw some conclusions (or draw some lack of conclusions) on this subject? I think the questions you are asking are very interesting.

So have we any reason to believe that physics is not unnecessarily complex?

Well, in order to believe that physics is unnecessarily complex, I'd first want to know if the presumption that the ratio of overly-complex universes containing self-aware beings to simpler universes that are just barely complex enough to contain self-aware beings is indeed something like ~infinity:1. Also, what about the assumption that these other universes even exist in the first place? What about the assumption that our presence in this universe is a random selection of an infinite set of logically possible universes? And even if the vast majority of universes containing self-aware life were overly-complex, should the beings in the simpler universes falsely conclude that their universe is overly complex? Just because they are a "1 in infinity chance" (I'm getting very curious about this probability thing now) doesn't mean they don't exist.

Just some thoughts. I only read the last page of this discussion. Tell me if I'm missing anything and I'll go back and read so I can better join this discussion.

Why these two ways?
As you've shown yourself, there is some reasoning to suggest that mind-bogglingly excessive complexity is the likeliest state. The middle ground would be a universe which is overcomplicated but not excessively so, which seems like a very tricky thing to justify probabilistically. Either it is drawn to simplicity, or it is not, in which case simplicity goes right out the window.

So then you mention "two ways." Regarding the first way, why would our universe be overwhelmingly likely to be only just as complex as it needs to be to allow for self-aware life? I would guess that this would be because a very large percentage of the mathematically possible universes that you said exist in this hypothetical are only barely complex enough to contain self-aware life.
I know that seems a bit unlikely, and to be honest I don't have a convincing argument to support this, except to point out what a slippery subject this is. It may not be as simple as evaluating what percentage of hypothetical universes are just complex enough. Considering the example of the probability of a random integer being a multiple of 10, I don't think anyone would seriously argue that this isn't 1/10, but I brought the matter into question by re-ordering the integers in a way that deliberately gives more weight to multiples of 10. Maybe there is an appropriate weight to give to all these outcomes and maybe we give the complex universes too much weight by considering them equivalent to the simpler ones. BTW this isn't exactly equivalent to "weight" as normally used in probability calculations. In the case of the integers a re-ordering was all that was needed to apply the weight. In the case of universes the set is probably uncountable so order isn't even an applicable concept, but still the way we think about them might be deceptive.

A pretty silly example:

For every real number A in the range 0<A<0.9 there are infinitely many real numbers in the range 0.9 to 1 which can be generated by inserting a sequence of 9's to the start of the decimal expansion of A.

(so 0.6177388... maps to 0.96177388..., 0.996177388..., 0.9996177388... and so on)

So there are infinitely more real numbers between 0.9 and 1 than there are between 0 and 0.9.

Clearly that's absolute rubbish but the reasoning is analogous to that used with the universes. That kind of reasoning does not work here.

I guess the idea that our universe is only as complex as it needs to be for self-aware life is an appealing one. If there's any validity to it, it gives us a guide to physics. And of course it satisfies humankind's innate wish to be the center of the universe.

I'll freely admit that my other reason for holding out hope for a simple universe is that the alternative gives me the heeby-jeebies a little. It's nice to think that physics isn't just a crazy mess of unnecessary junk. Wouldn't we all like it to be beautiful and elegant in its minimalist simplicity?

Also, what about the assumption that these other universes even exist in the first place?
I'm invoking Occam's Razor on that one. If the notion of existence can be applied to universes, then any potential (logically consistent) universe may either exist or not. You might think of existence as a true or false property which would have to be determined by some meta-system beyond universes. On the other hand we might simply observe that we don't need a notion of existence. What if this universe didn't exist? Would it change what you are thinking right now? Absolutely not. What you are thinking right now is a property of the system. Circles don't have to exist in order to be round. So why does our universe have to "exist" in order to have the properties that it has? Cut the property of "existence" out of the model and all possible universes are equivalent in that regard.

What about the assumption that our presence in this universe is a random selection of an infinite set of logically possible universes?
I think it's pretty clear that there are infinitely many logically possible universes. But rather than random selection, what I'd suggest is that none can be assumed to exist more than others, and just ask "If you are a self-aware lifeform in one such universe, what sort of universe are you most probably in?"

And even if the vast majority of universes containing self-aware life were overly-complex, should the beings in the simpler universes falsely conclude that their universe is overly complex?
Phrases like "vast majority" are probably best avoided here. But I suppose the answer is yes.

Just some thoughts. I only read the last page of this discussion. Tell me if I'm missing anything and I'll go back and read so I can better join this discussion.
I wouldn't bother going back, this is a recent deviation.

Incidentally, another objection to over-complex universes is one of absurdity, and I'm not sure if this is a valid argument. Let's say for example that there may be many dimensions without that stuffing things up, so in an overly complex universe the number of dimensions is more or less the kind of number you would get if you picked a number at random out of all the integers (hey, I found a way to do that!). So, how big would that number be? However big it was, there would be infinitely more integers greater than it than (positive) integers less than it. So it's pretty surprising that it would be such a small number. Like if it was <10, then it would be incredibly small and really indicative of a simple universe rather than a complex one. But the same applies equally if it were <100, or <1000, and so on. I don't know if there's a valid argument in there somewhere, but I feel that there might be the beginnings of one.


I think I found a way to do the infinite probability, using limits which I hinted at earlier.

It would work by having a set of all positive integers from 1 to N, finding probability P for an event E given N. Then take the limit of P(N) as N approaches infinity. If the limit doesn't exist, it's an "invalid infinite probability" so to speak.

Here's an example based on what octopuppy was framing earlier:

~~~~~~~~~~~

If E was picking an integer divisible by 10, then P(N) = truncate(N/10) / N = (N/10 - fractional(N/10)) / N

When you take the limit of that as N->infinity, fractional(N/10) is always less than 1 so it becomes insignificant, and the limit is the same as that of (N/10)/N = 1/10

~~~~~~~~~~~

Now considering octopuppy's paradoxical reordering:

1,10,2,20,3,30,4,40,5,50,6,60,7,70,8,80,9,90,11,100,12,110,13,120...

With the new way of doing it, you have to start out finite. So either you have to skip a bunch to keep the set finite, or you follow the notation above and go all the way up to N, but then you have a bunch of 'unpaired' non-multiples-of-ten. Specifically, the number of paired non-multiples equals the number of paired multiples, which is all of the multiples, because a multiple of ten is always bigger than the non-multiple it is paired with. So the number of pairs is truncate(N/10) and the number of unpaired integers is N - 2*truncate(N/10), which is roughly four fifths of the set. Anyway, do the math and you get the same limit as you take N to infinity: (N/10)/N = 1/10.

So you can see the real source of the paradox is that the number of unpaired integers, which is 4/5 of the set at each increasing-N finite set, can be meaninglessly zero in the infinite version of the set, because 4/5ths of infinity is meaningless.
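The limit recipe above can be sketched in a few lines (the function names are mine, just for illustration); for N a multiple of 10, P(N) is exactly 1/10 and the unpaired fraction is exactly 4/5:

```python
def p_multiple_of_10(n):
    """P(N): chance a uniform pick from {1..N} is divisible by 10,
    i.e. truncate(N/10) / N."""
    return (n // 10) / n

def unpaired_fraction(n):
    """Fraction of {1..N} left unpaired when each multiple of 10 is
    paired with one non-multiple: (N - 2*truncate(N/10)) / N."""
    return (n - 2 * (n // 10)) / n

for n in (10, 1_000, 1_000_000):
    print(n, p_multiple_of_10(n), unpaired_fraction(n))
# P(N) -> 1/10 and the unpaired fraction -> 4/5 as N grows
```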

I think I found a way to do the infinite probability, using limits which I hinted at earlier.

It would work by having a set of all positive integers from 1 to N, finding probability P for an event E given N. Then take the limit of P(N) as N approaches infinity. If the limit doesn't exist, it's an "invalid infinite probability" so to speak.

Here's an example based on what octopuppy was framing earlier:

~~~~~~~~~~~

If E was picking an integer divisible by 10, then P(N) = truncate(N/10) / N = (N/10 - fractional(N/10)) / N

When you take the limit of that as N->infinity, fractional(N/10) is always less than 1 so it becomes insignificant, and the limit is the same as that of (N/10)/N = 1/10

~~~~~~~~~~~

Now considering octopuppy's paradoxical reordering:

1,10,2,20,3,30,4,40,5,50,6,60,7,70,8,80,9,90,11,100,12,110,13,120...

With the new way of doing it, you have to start out finite. So either you have to skip a bunch to keep the set finite, or you follow the notation above and go all the way up to N, but then you have a bunch of 'unpaired' non-multiples-of-ten. Specifically, the number of paired non-multiples equals the number of paired multiples, which is all of the multiples, because a multiple of ten is always bigger than the non-multiple it is paired with. So the number of pairs is truncate(N/10) and the number of unpaired integers is N - 2*truncate(N/10), which is roughly four fifths of the set. Anyway, do the math and you get the same limit as you take N to infinity: (N/10)/N = 1/10.

So you can see the real source of the paradox is that the number of unpaired integers, which is 4/5 of the set at each increasing-N finite set, can be meaninglessly zero in the infinite version of the set, because 4/5ths of infinity is meaningless.

Didn't quite follow all of that but it seems that what you're saying is that you're selecting a subset of the integers based on a range of their values. That's all fine and dandy for integers where there is a "correct" order that you can point to, but if instead of integers we had a set containing infinitely many monkeys and infinitely many typewriters, that doesn't work. Let's say each monkey and each typewriter are marked with unique integers, you could order them based on that:

M0, T0, M1, T1, M2, T2, M3, T3, M4, T4, M5, T5, M6, T6, M7, T7, M8, T8, M9, T9, M10, T10, M11, T11...

Or you might decide that since there is only one typewriter for every 10 monkeys, you should ignore the monkeys' last digit and order them thus:

M0, M1, M2, M3, M4, M5, M6, M7, M8, M9, T0, M10, M11, M12, M13, M14, M15, M16, M17, M18, M19, T1, M20, M21...

Which ordering is correct? Who knows? One would suggest that if you selected an object at random from the set, you've a 50% chance of getting a typewriter. The other suggests you have only a 1/11 chance.

Consider rational numbers, another countably infinite set. How would you order those?

Without a clearly "correct" ordering, I can't address a question like "what proportion of fractions (when simplified) have an even numerator?"

When we're talking about universes, we're probably talking about an uncountably infinite set, so ordering isn't possible anyway, and thus no finite range can be expanded to find a limit.
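The two interleavings really do give different limiting proportions. Here's a sketch (generator names are my own) that walks each ordering and counts typewriters:

```python
from itertools import count, islice

def alternating():
    """M0, T0, M1, T1, ...: one typewriter after every monkey."""
    for i in count():
        yield ('M', i)
        yield ('T', i)

def ten_monkeys_per_typewriter():
    """M0..M9, T0, M10..M19, T1, ...: one typewriter after every ten monkeys."""
    for i in count():
        for j in range(10 * i, 10 * i + 10):
            yield ('M', j)
        yield ('T', i)

def typewriter_fraction(seq, k):
    """Proportion of typewriters among the first k items of seq."""
    return sum(1 for kind, _ in islice(seq, k) if kind == 'T') / k

print(typewriter_fraction(alternating(), 110_000))                 # 0.5
print(typewriter_fraction(ten_monkeys_per_typewriter(), 110_000))  # ~1/11
```

Exactly the same collection of monkeys and typewriters, yet the "chance of a typewriter" depends entirely on which ordering you trust.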


Didn't quite follow all of that but it seems that what you're saying is that you're selecting a subset of the integers based on a range of their values. That's all fine and dandy for integers where there is a "correct" order that you can point to, but if instead of integers we had a set containing infinitely many monkeys and infinitely many typewriters, that doesn't work. Let's say each monkey and each typewriter are marked with unique integers, you could order them based on that:

M0, T0, M1, T1, M2, T2, M3, T3, M4, T4, M5, T5, M6, T6, M7, T7, M8, T8, M9, T9, M10, T10, M11, T11...

Or you might decide that since there is only one typewriter for every 10 monkeys, you should ignore the monkeys' last digit and order them thus:

M0, M1, M2, M3, M4, M5, M6, M7, M8, M9, T0, M10, M11, M12, M13, M14, M15, M16, M17, M18, M19, T1, M20, M21...

Which ordering is correct? Who knows? One would suggest that if you selected an object at random from the set, you've a 50% chance of getting a typewriter. The other suggests you have only a 1/11 chance.

(Note: I later decide that this reasoning is flawed, but here's my initial response anyways:) I don't think it depends on the order. Your second ordering showing the ten monkeys for each typewriter is flawed. If both monkeys and typewriters can be assigned a positive integer, then both sets (monkeys and typewriters) are countably infinite and thus there are exactly the same amount of monkeys as typewriters. Therefore, there are NOT ten monkeys for every typewriter, as you said there were. So I think since they're both countably infinite, you'd be correct in saying that you have a 50% chance of getting a typewriter.

Consider rational numbers, another countably infinite set. How would you order those?

Without a clearly "correct" ordering, I can't address a question like "what proportion of fractions (when simplified) have an even numerator?"

Warning: I now disagree with what I just wrote, but you can see it anyways so you know what my thoughts are:

Again, I don't think ordering is necessary. And I think I can address that question: "what proportion of fractions (when simplified) have an even numerator?" The answer is half. I say this because the set of rational numbers is countably infinite, the set of rational numbers that have odd numerators is countably infinite, and the set of rational numbers that have even numerators is countably infinite. Because this means there are the same amount of rational numbers with odd numerators as there are rational numbers with even numerators, then the (answer to your question is that the) proportion of simplified fractions with an even numerator is half of all simplified fractions.

Actually, now that I write this, isn't it also true by the same reasoning that there are the same number of rational numbers with even numerators as there are rational numbers with any numerator? Yes, but then the proportions don't work out, so what I said in the paragraph above was wrong.

To say it another way, the set of all natural numbers, the set of all positive odd numbers, and the set of all prime numbers are all countably infinite. However, a randomly selected natural number has a 50% chance of being an odd number but about a ~0% chance of being prime! Oh no, this infinity business is hard to understand. So how does this probability work? How can you possibly determine the "portion" of elements of a certain kind in an infinite set when there isn't a clear "order" to the set, which, as octopuppy pointed out, seems to be required? The proportion of prime numbers in the set of natural numbers seems to be 0% to me. As octopuppy said, I think I only know this because I know how the "order" works, and how prime numbers become increasingly rare in this order of natural numbers as the number of natural numbers increases.

So... not knowing the "order" of universes, how do we know if a certain kind of universe is an odd number or a prime number so to speak? Intuition tells me that the "simpler" universes are the prime numbers, i.e. they are very rare. I don't know what reasoning we have for thinking this still, though. This really is a subject where we can fall into poor reasoning pits. Hmmm....
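The odds-versus-primes contrast is easy to see numerically (a sketch; the helper names are mine). The odd fraction sits at 1/2 for every cutoff, while the prime fraction keeps falling:

```python
def prime_fraction(n):
    """Fraction of {1..n} that is prime, via a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # cross off every multiple of p starting at p*p
            sieve[p * p :: p] = [False] * len(range(p * p, n + 1, p))
    return sum(sieve) / n

def odd_fraction(n):
    """Fraction of {1..n} that is odd."""
    return sum(1 for i in range(1, n + 1) if i % 2 == 1) / n

for n in (1_000, 100_000, 1_000_000):
    print(n, odd_fraction(n), prime_fraction(n))
# the odd fraction stays at 0.5, while the prime fraction keeps shrinking
# (by the prime number theorem it behaves like 1/ln(n), which -> 0)
```

Both subsets are countably infinite, yet their densities within the natural order are completely different, which is exactly why cardinality alone can't settle these probabilities.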

Edited by Use the Force

In the case of universes the set is probably uncountable so order isn't even an applicable concept, but still the way we think about them might be deceptive.

A pretty silly example:

For every real number A in the range 0<A<0.9 there are infinitely many real numbers in the range 0.9 to 1 which can be generated by inserting a sequence of 9's to the start of the decimal expansion of A.

(so 0.6177388... maps to 0.96177388..., 0.996177388..., 0.9996177388... and so on)

So there are infinitely more real numbers between 0.9 and 1 than there are between 0 and 0.9.

Clearly that's absolute rubbish but the reasoning is analogous to that used with the universes. That kind of reasoning does not work here.

So I guess the question is, what kind of reasoning CAN we use here?


I forget if anyone made this point, so I'll make it here just to make sure.

I don't think it's possible to randomly select a natural number from the set of all natural numbers. I say this because any natural number in the set is finite and is thus inevitably way too small to be picked. For example, let's say we're picking a number from the set of natural numbers from 1 to 10^n. ~90% of the time we would expect the randomly selected number to be bigger than 10^(n-1) (i.e. to have the full n digits); ~99% of the time bigger than 10^(n-2); etc. As n goes to infinity we'd expect our randomly selected number to have an absurdly large number of digits with probability approaching 1. Uh, yeah. So, I don't think it's possible to choose a randomly selected element from an infinite set. Having said that, if I "pick" our universe from the infinite set of universes... umm, our universe is not prime, but it has a 50% chance of being even and a 50% chance of being odd. :)
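The pattern is easy to verify for any finite cutoff 10^n: almost the whole range crowds up against the top. A sketch (the function name is mine):

```python
def fraction_exceeding(n, k):
    """For a uniform pick from {1..10^n}, the fraction of outcomes
    bigger than 10^(n-k), i.e. having close to the full n digits."""
    return (10 ** n - 10 ** (n - k)) / 10 ** n

print(fraction_exceeding(9, 1))  # 0.9
print(fraction_exceeding(9, 2))  # 0.99
print(fraction_exceeding(9, 6))  # 0.999999
```

However large you make n, the picked number is overwhelmingly likely to be near the top of the range, and any fixed finite number ends up "too small to be picked" in the limit.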

Also, I just googled something and came across this:

http://answers.google.com/answers/threadview/id/599207.html

The guy who wrote the comment to the question says that because there is no probability distribution that makes sense for an infinite set, "one can't really randomly select a number".

I don't think it's possible to randomly select a natural number from the set of all natural numbers. I say this because any natural number in the set is finite and is thus inevitably way too small to be picked. For example, let's say we're picking a number from the set of natural numbers from 1 to 10^n. ~90% of the time we would expect the randomly selected number to be bigger than 10^(n-1) (i.e. to have the full n digits); ~99% of the time bigger than 10^(n-2); etc. As n goes to infinity we'd expect our randomly selected number to have an absurdly large number of digits with probability approaching 1. Uh, yeah. So, I don't think it's possible to choose a randomly selected element from an infinite set.
That's more or less the point I was driving at earlier when I suggested that unnecessarily complex universes were absurd because the number of unnecessary dimensions would be something like a randomly selected natural number. I'm still not sure if it's valid, but distribution seems to be the key here, and because the distribution of natural numbers is unbounded it gives us a problem.

Picking a real number between 0 and 1 is something we can at least think about doing. Sticking an imaginary pin at a random point along a line doesn't work physically but conceptually it's the equivalent of setting our quantum RNG to generate endless random bits and using these as the binary expansion of the number. Of course we never reach the end of the process but halve the interval in which our random point sits with every bit we generate. Problems to do with probability can sometimes be answered that way. Maybe we could do something similar for a range of rational numbers. We can devise a mapping from a range of rational numbers to natural numbers, but cannot do so in a way that preserves order, so a decreasing range of rational numbers maps to an infinite set of ranges of natural numbers, which isn't really useful.
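The endless-random-bits picture can be made concrete: each generated bit halves the interval known to contain the point. A sketch of the idea (names are mine):

```python
import random

def refine_interval(bits):
    """Interpret a bit stream as the binary expansion of a real in [0, 1),
    yielding the interval known to contain the point after each bit."""
    lo, hi = 0.0, 1.0
    for b in bits:
        mid = (lo + hi) / 2
        if b == 0:
            hi = mid  # bit 0: the point lies in the lower half
        else:
            lo = mid  # bit 1: the point lies in the upper half
        yield (lo, hi)

bits = [random.getrandbits(1) for _ in range(10)]
for lo, hi in refine_interval(bits):
    print(f"[{lo:.6f}, {hi:.6f})  width {hi - lo}")
# After k bits the interval has width 2**-k. The process never finishes,
# but every finite prefix pins the point down further.
```

This is exactly the "never reach the end, but halve the interval with every bit" process described above, written out as code.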

When thinking about distribution, fractals spring to mind. This is just a mental image, BTW, not anything mathematically justified. Within the 2-dimensional space of a Mandelbrot set visualisation you get a large region whose points are clearly in the set, a large region whose points clearly are not, and a vanishingly thin but infinitely intricate border between the two. Maybe the more complex universes correspond to just such a fractal border, where tiny variations make the difference between a universe that supports life and one that doesn't, while the larger space of life-supporting universes is not on the border but inside it. That model doesn't really make too much sense when you think about it, but I'm just using it to suggest that complexity doesn't need to fill the space of the probability distribution.
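For anyone who wants to play with that mental image: Mandelbrot membership is decided by the escape-time test, iterating z -> z^2 + c from z = 0 and checking whether |z| stays bounded. A minimal sketch (the iteration budget is an arbitrary choice of mine):

```python
def in_mandelbrot(c, max_iter=100):
    """Escape-time test: treat c as inside the set if |z| stays <= 2
    for max_iter iterations of z -> z**2 + c starting from z = 0."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False  # escaped: definitely outside the set
    return True  # never escaped within the budget: inside (approximately)

print(in_mandelbrot(0))   # True: the origin is deep inside the set
print(in_mandelbrot(1))   # False: the orbit 0, 1, 2, 5, ... escapes
print(in_mandelbrot(-1))  # True: the orbit cycles 0, -1, 0, -1, ...
# Points near the border are the expensive ones: they take many
# iterations to decide, which is what makes the border "complex".
```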


That's more or less the point I was driving at earlier when I suggested that unnecessarily complex universes were absurd because the number of unnecessary dimensions would be something like a randomly selected natural number.

How often does a universe with a googolplex dimensions turn out as simple as ours? I'd guess 0% (in the same sense: what are the odds that a number with a googolplex digits is prime? Almost 0%). Thinking of number of dimensions as a measure of complexity, I come up with this thought: perhaps there's a distribution of possibilities for the true complexity of a universe containing self-aware humans like us, something like a Normal curve with complexity on the x-axis. This Normal curve would say that it is highly unlikely that an extremely complex universe would contain beings as simple as us, and highly unlikely that a simple universe would produce beings as complex as us.

Perhaps this idea is similar to your fractal border idea?

unreality: I read those two links. Prime numbers are indeed rare (asymptotic density 0) even though the primes and the natural numbers have the same cardinality. It's because of the distribution. Good to know.
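The "density 0 despite equal cardinality" point follows from the prime number theorem: the density of primes up to N is roughly 1/ln N, which vanishes as N grows. A quick check with a simple sieve (code is my own illustration):

```python
import math

def prime_count(n):
    """Count primes <= n with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            # Cross out every multiple of p starting at p*p.
            sieve[p*p::p] = b"\x00" * len(sieve[p*p::p])
    return sum(sieve)

for k in range(2, 7):
    n = 10**k
    density = prime_count(n) / n
    print(f"N=10^{k}: density {density:.4f}, 1/ln N = {1/math.log(n):.4f}")
# The density keeps falling like 1/ln N, so a googolplex-digit number
# is prime with probability about 1/(digits * ln 10): effectively 0.
```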

How often does a universe with a googolplex dimensions turn out as simple as ours? I'd guess 0% (in the same sense: what are the odds that a number with a googolplex digits is prime? Almost 0%). Thinking of number of dimensions as a measure of complexity, I come up with this thought: perhaps there's a distribution of possibilities for the true complexity of a universe containing self-aware humans like us, something like a Normal curve with complexity on the x-axis. This Normal curve would say that it is highly unlikely that an extremely complex universe would contain beings as simple as us, and highly unlikely that a simple universe would produce beings as complex as us.

Perhaps this idea is similar to your fractal border idea?

I think so. As far as I know, our universe could have as many dimensions as you like, as long as the superfluous ones are small and inconsequential. The concept of a probability distribution over mathematical models is something I just can't get my head around.

No time to read it all right now but I found this:

http://lesswrong.com...f_the_observed/

Looks like an interesting forum anyway


No time to read it all right now but I found this:

http://lesswrong.com...f_the_observed/

Looks like an interesting forum anyway

Looks like an intelligent forum also... I couldn't understand a lot of what was said (I only read part of it).

Regarding our discussion, one person wrote:

"Assume without loss of generality that each universe can be represented by a program in some Turing-complete language. Assume for the sake of argument that we are ignoring small programs and considering only large ones. Divide all programs into two categories:

1. Those that produce regular output. (A large program may do this if, for example, its execution gets stuck in a small loop.)

2. Those that produce pseudorandom output.

The ratio of the two categories (in the limit as size goes to infinity) depends on the language, but this doesn't matter, because the second category is unobservable from inside (a pseudorandom universe is unlikely to support life and certainly won't provide selection pressure for intelligence). Therefore we must observe our universe to be in the first category. This means the observed laws of physics must be simple (such as could have been generated by a small program), regardless of whether the "actual code" of the universe is small or large." ( http://lesswrong.com/lw/393/multiverse_and_complexity_of_laws_of_the_observed/33li )

I understand the point he is making, but I don't understand the details nearly enough to say I agree with him. Intuitively, however, what he says seems to make sense.

Also:

"Well, there are some possible anthropic drivers that come immediately to mind. Complex does not necessarily imply smart; it might be that large numbers of physical laws produce chaotic effects that forbid or strongly disadvantage the emergence of intelligent agents. And if there's anything to the simulation hypothesis, then universes with simpler laws would be easier to simulate, which could again skew the set of possible universes towards simplicity." ( http://lesswrong.com/lw/393/multiverse_and_complexity_of_laws_of_the_observed/33kq )

The "simulation hypothesis" part caught my attention. For example, if we could create (i.e. "simulate") a universe here on Earth, it would undoubtedly be quite simple. So how does this affect the probability that a given universe is simple or complex?

Lastly:

"What's most significant about this question is that we've seen more than enough evidence to conclude that if we're at all generic observers, then universes must be weighted pretty directly by simplicity of the underlying mathematical structure.

Otherwise, as you point out, we'd be likely to be in a very labyrinthine mathematical structure; but most of those should not have the property that progress in physics leads you to consecutively (mathematically) simpler laws with more explanatory power. Instead, the things that make fire burn and the things that make plants grow should turn out to have nothing in common, etc..." ( http://lesswrong.com/lw/393/multiverse_and_complexity_of_laws_of_the_observed/35sw )

From this I like the idea that universes are "weighted pretty directly by simplicity of the underlying mathematical structure." The fact that the bulk of the equations for our current understanding of physics could be written down in a textbook means that our universe appears very simple relative to what it potentially could be (bazillions of chaotic laws that beings of human intelligence couldn't at all begin to understand).

Thinking of the latest in physics with all the quantum stuff and string theory that I don't understand, it's also a possibility that our universe that appears simple is in fact much more complex, but has some sort of pattern or whatever to the laws which allows us to observe something that appears simpler.


Thinking of the latest in physics with all the quantum stuff and string theory that I don't understand, it's also a possibility that our universe that appears simple is in fact much more complex, but has some sort of pattern or whatever to the laws which allows us to observe something that appears simpler.

This is more or less my viewpoint. We only approximate 'true physics' with our equations, which could be much simplified versions of the real thing, yet accurate enough to be useful to us. An example would be using 1 + x + x^2/2 + x^3/6 to approximate e^x. It ignores infinitely many terms of the Taylor series but is still a very close approximation as long as x stays within a certain range.
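The cubic Taylor example is easy to quantify: near x = 0 the truncation error is tiny, and it blows up as x leaves that range. A sketch (function name is mine):

```python
import math

def e_approx(x):
    """Cubic Taylor polynomial for e^x about 0: 1 + x + x^2/2 + x^3/6."""
    return 1 + x + x**2 / 2 + x**3 / 6

for x in (0.1, 0.5, 1.0, 3.0):
    err = abs(math.exp(x) - e_approx(x))
    print(f"x={x}: exp={math.exp(x):.4f}, approx={e_approx(x):.4f}, error={err:.6f}")
# The error is about 4e-6 at x=0.1 but about 7 at x=3: the approximation
# is excellent inside a limited range and useless outside it.
```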

We've seen other examples of this in physics throughout the centuries. Newton's simple laws of motion and gravity, for example, work for the kinds of speeds, distances, and times that we see in our everyday human lives on Earth, though they are just approximations to the more complex relativistic equations.
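Same story in numbers: the Lorentz factor gamma = 1/sqrt(1 - v^2/c^2), which measures how far relativity departs from Newton, is indistinguishable from 1 at everyday speeds. A quick illustration:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v):
    """Lorentz factor for speed v: Newton's laws correspond to gamma = 1."""
    return 1 / math.sqrt(1 - (v / C) ** 2)

for label, v in [("car (30 m/s)", 30.0),
                 ("jet (250 m/s)", 250.0),
                 ("half lightspeed", C / 2)]:
    print(f"{label}: gamma = {lorentz_gamma(v):.12f}")
# At human speeds gamma differs from 1 by roughly 1e-14, which is why
# Newton's approximation held up for centuries.
```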

It could be that we are forever simplifying true reality.


This is more or less my viewpoint. We only approximate 'true physics' with our equations, which could be much simplified versions of the real thing, yet accurate enough to be useful to us. An example would be using 1 + x + x^2/2 + x^3/6 to approximate e^x. It ignores infinitely many terms of the Taylor series but is still a very close approximation as long as x stays within a certain range.

We've seen other examples of this in physics throughout the centuries. Newton's simple laws of motion and gravity, for example, work for the kinds of speeds, distances, and times that we see in our everyday human lives on Earth, though they are just approximations to the more complex relativistic equations.

It could be that we are forever simplifying true reality.

Only someone who just learned Taylor series would use it as an example in such a circumstance :lol: Newton's laws are a much better example. But, yeah, this could definitely be true. It's not like there's a hidden set of physical laws that we are expecting to uncover. Rather, we're just trying to create equations/theories/etc. that do the best possible job of describing what happens in our universe. Comparing physics to economics is a good illustration. There are no inherent laws of economics built into the universe or our genes, but economists nevertheless create theories and equations to try to better understand and predict what we do economically. And in economics, what we actually do is much, much more complex than the equations economists make to try to predict what we do in the market. So I agree with you: our physics, like our economics, is likely just approximating whatever the true laws of physics make happen in our universe.

On a similar note, I wouldn't be surprised if we never found a grand unifying theory in physics like many physicists hope to find. One may simply not exist, and it may indeed be true that the universe is much, much more complex than it appears through our current physics equations. It may also be that the simplified version we perceive isn't a perfect simplification of the universe's structure: things behaving differently at small scales and large scales cannot be unified into one simple theory of everything, because our equations are just simple shadow approximations of our actually complex universe. Is what I'm saying clear? If not, how better could I communicate what it is that I'm trying to say :P ?


Only someone who just learned Taylor series would use it as an example in such a circumstance :lol:

Sorry, learned em a year or two ago. It was just the first thing that came to my head. :unsure:

And in economics, what we actually do is much, much more complex than the equations economists make to try to predict what we do in the market. So I agree with you: our physics, like our economics, is likely just approximating whatever the true laws of physics make happen in our universe.

The difference is that the discrete behavior of economics means that, with enough data and computing power, we could entirely encapsulate economics. You might then say that you could extend that to physics too, with "enough data and computing power", but because of Heisenberg and whatnot, and the continuum-vs-discrete question (we don't know which yet, but either way), that's not possible for physics. So there's the fundamental difference.

On a similar note, I wouldn't be surprised if we never found a grand unifying theory in physics like many physicists hope to find. One may simply not exist, and it may indeed be true that the universe is much, much more complex than it appears through our current physics equations. It may also be that the simplified version we perceive isn't a perfect simplification of the universe's structure: things behaving differently at small scales and large scales cannot be unified into one simple theory of everything, because our equations are just simple shadow approximations of our actually complex universe. Is what I'm saying clear? If not, how better could I communicate what it is that I'm trying to say :P ?

I understand you and agree. Although if we broaden both our estimation for "large things" and "small things" enough our simplifications might find some crossover room and we could come up with a unifying theory. That doesn't mean it's right, it's still just an infinitesimal approximation to reality, but unified in one instead of separate globes. I see no reason why we can't get slightly more complex to encompass the aspect of reality we know about (our "really small" (quantum) and "really big" (galaxies) might be NOTHING compared to the true scale/scope of the universe but nevertheless we'd think we'd unified it).


Sorry, learned em a year or two ago. It was just the first thing that came to my head. :unsure:

I have to say that despite the fact that I only learned Taylor series this year and despite knowing that I'm roughly a grade older than you, I strongly suspected when writing, "Only someone who just learned Taylor series would use it as an example in such a circumstance" that you had learned it a while ago and that it was just a random thought popping into your head.

The difference is that the discrete behavior of economics means that, with enough data and computing power, we could entirely encapsulate economics. You might then say that you could extend that to physics too, with "enough data and computing power", but because of Heisenberg and whatnot, and the continuum-vs-discrete question (we don't know which yet, but either way), that's not possible for physics. So there's the fundamental difference.

How would enough data and computing power allow us to totally encapsulate economics? The economists still wouldn't know for sure what type of car I was going to buy, etc. Sure, the economics could be a lot better, but no matter how much computing power you had it still wouldn't be perfect.

Even if the universe was deterministic, we wouldn't be able to compute the future from inside it. I think the same is true of economics. You can't perfectly predict what I'm going to buy either. So the fundamental difference you're pointing at is fuzzy; I don't see it.

I understand you and agree. Although if we broaden both our estimation for "large things" and "small things" enough our simplifications might find some crossover room and we could come up with a unifying theory. That doesn't mean it's right, it's still just an infinitesimal approximation to reality, but unified in one instead of separate globes. I see no reason why we can't get slightly more complex to encompass the aspect of reality we know about (our "really small" (quantum) and "really big" (galaxies) might be NOTHING compared to the true scale/scope of the universe but nevertheless we'd think we'd unified it).

I agree; that's definitely a possibility also.


How would enough data and computing power allow us to totally encapsulate economics? The economists still wouldn't know for sure what type of car I was going to buy, etc. Sure, the economics could be a lot better, but no matter how much computing power you had it still wouldn't be perfect.

Even if the universe was deterministic, we wouldn't be able to compute the future from inside it. I think the same is true of economics. You can't perfectly predict what I'm going to buy either. So the fundamental difference you're pointing at is fuzzy; I don't see it.

I guess I mean that with economics, we could model every piece of money in the universe: every object, where it is, where it has been, whatever. On second thought it wouldn't be very easy, but that's why I said with enough "data and computing power". But even with that we wouldn't be able to predict the future, of course (due to irrational human decision making), just model the present. We can't even model the present for physics; that's just fundamentally impossible. I guess there's not really that much of a distinction, especially if you map the two systems one-to-one (or at least closely enough to preserve physics' uncertainty) by introducing the ability to purchase the naming rights to atom x or galaxy y.

So yeah you're right, it's very fuzzy. Forget the economics thing hahaha


How would enough data and computing power allow us to totally encapsulate economics? The economists still wouldn't know for sure what type of car I was going to buy, etc. Sure, the economics could be a lot better, but no matter how much computing power you had it still wouldn't be perfect.

Even if the universe was deterministic, we wouldn't be able to compute the future from inside it. I think the same is true of economics. You can't perfectly predict what I'm going to buy either. So the fundamental difference you're pointing at is fuzzy; I don't see it.

But if the world is deterministic, then that means that we could predict the future, given all the relevant prior facts. We could build a model of the Universe and watch it play out, if we had all the data points to start it. I think that that is in part what unreality is saying. It is possible to build mathematical models of human systems. The trouble is that the structure of human systems is subtle and often overlooked.

An economist couldn't know with 100% confidence what kind of car you might buy, but knowing enough information about you and your life, they might be able to create a prediction with a high probability of being correct. If every car your family has ever owned has been a Ford, chances are, you will buy a Ford. Knowing your income and age, they can predict what kind of Ford might be within your price range and interest. "He's young and on a tight budget. He'll probably be interested in a Ford Focus because of the fuel economy." There are entire careers built around predicting what people are going to buy. That's the business of Venture Capitalism, in a way. They want to invest their money in the next "Big Thing" that will sweep the world in a fad (or, they hope, much more).

So I agree with unreality. Given enough data points and enough computing power, I think it would be possible to predict what you or anyone else might decide to do in a given situation. Whether that is feasible to accomplish at this point in time is another matter. Of course with Google et al. data-mining every search you make and every thing you purchase online or with credit, it becomes easier to track your interests and attempt to engage your attention to products specifically catered to your buying/searching history.

I see unreality is giving up on his own argument...but it seems to me that if a system is deterministic, then it must be possible to build a model of it, given the relevant data. Can anyone think of a system that while deterministic, couldn't be modeled? :unsure:
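One candidate answer to that closing question: chaotic systems. They are fully deterministic and can be modeled exactly in principle, but any error in the initial data grows exponentially, so long-range prediction fails in practice. The classic toy example is the logistic map (this sketch is my own, not from the thread):

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a deterministic chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.400000000, 50)
b = logistic_orbit(0.400000001, 50)  # initial conditions differ by 1e-9

for step in (0, 10, 30, 50):
    print(f"step {step}: |difference| = {abs(a[step] - b[step]):.6f}")
# The two orbits agree early on, then diverge completely: determinism
# without practical predictability.
```

So a deterministic system can always be modeled, but modeling it usefully requires perfect initial data, which is exactly what Heisenberg (and plain measurement error) denies us.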
