(In response to questions on www.intuitionism.org) For a mathematician, and especially for an intuitionist, your questions are incredibly vague, and your "explanations" even more so. I'm sorry I won't be able to provide a string for you (or I can provide [#^10], but that would of course be cheating), but I'll try to explain how I feel about each item, and you can interpret those as answers.
The question can be regarded as mildly sensible, but the explanation is lunatic. By that thinking, no function other than a constant is total (because if we don't know what x is, we never know the value of the function), and if we change the rules so drastically, even constant functions aren't surely total: if it is possible that we don't know x, what makes us think that we know which constant we have?
The way IVT is normally stated, precision never enters the picture. The theorem speaks about a curve and a line intersecting, and it is usually explicitly said that the statement is in no way constructive. Your idea of functions as computations to some precision is practically irrelevant here.
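For reference, here is the usual classical statement, in which precision indeed never appears (a sketch; the constructive caveat in the comment is the standard one):

```latex
\textbf{IVT.} If $f \colon [a,b] \to \mathbb{R}$ is continuous and
$f(a) < 0 < f(b)$, then there exists $c \in (a,b)$ with $f(c) = 0$.
% Constructively, only the approximate version survives in general:
% for every $\varepsilon > 0$ there is a $c$ with $|f(c)| < \varepsilon$;
% an exact zero cannot in general be located.
```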
Here again you (ok, Brouwer) change the rules so drastically that the question loses any trace of sense. If the power set of a given set might not exist as a finished whole, then obviously we aren't talking about sets (or are "finished wholes" a special kind of sets? If so, please define them). Let's call these objects "xets". What makes us think that xets have cardinalities in the first place, and more importantly, that they should have a monopoly on the term? The theory of cardinality for sets is surely richer and more structurally interesting than the theory of cardinality for xets, and if one of them is to be called simply "cardinality", I know which one I'll choose.
Another problem: however you define "finished wholes", I don't think they can really be infinite. Can you give me an example of a "countably infinite" xet which is not "countably infinite unfinished"? I tried to find out more about this, but the only Google result for "countably infinite unfinished" was your page.
Ok, this one is interesting, quasi-meaningful, and I might even be willing to answer '+', if it weren't for the word "definite" in the question. Yes, CH is meaningful. It holds in some well-understood models of ZF, and it has many interesting consequences (CH was one of the themes of my Master's thesis, so I know quite a bit about it), but your "definite" reads as if you want to fix one particular model now and forever. I don't think math works that way. Even if we someday agree whether CH is true or false, it still won't be "definite" in that sense: we reserve the right to change our opinion.
"Many people" consider intuitionism meaningless, also. :-] The very idea that inaccessible cardinals are "so big that they are completely irrelevant for all 'normal' mathematics" is itself very useful (in 'abnormal' - or better, 'meta-normal' - mathematics:), because it gives us a way to construct models of 'normal' mathematics (for example, of ZF). I'd dare say that inaccessible cardinals wouldn't be nearly as interesting to study if they didn't have the very property you cite as an argument for their meaninglessness.
(Ok, I suppose this one is a '+'. Spirit of set theory demands so. If no other theory will lend it a helping hand, it must appoint its own model builders.)
This is so ridiculous I don't know where to start. At a basic level, it is the same as the old riddle "which month has 28 days", with the intended answer "every one". I mean, to the question "is it easy to build a machine..." you answer "yes, just build two machines..." I'd like to see you as a real-world machine builder for some factory with a tight budget, interpreting your contract that way. :-] The essential part of "building a machine" (at least until you actually start physically building one, which I hope is not the issue here) is obviously designing it. If you fork every time a decision is needed, I don't think it can be called "building".
Also, if such a thing is allowed, I can build a machine with any behavior at all - I just build all possible machines. [I can even build a machine to decide which one of two starting machines is the right one, and then build it.:] The question is very uninteresting (to say the least) if interpreted that way.
Contrary to the wishful thinking of some wannabe formal-logic popularizers, there is no semantics-preserving isomorphism between propositional logic and natural language. In particular, "\rightarrow" is not the same as "implies", even to a classical logician (I'm sure you understand that the intuitionistic "\rightarrow" is also not the same as "implies"). Nor is "\vee" the same as "or", and the disjunction itself has nothing to do with implication. These connectives are just approximations, nice enough for natural deduction, resolution, and some other machinery, but fundamentally they are different things. I think no mathematician would interpret "implies" in such a way.
So the explanation is meaningless. But the question? It is interesting, and might even have '+' as an answer. Personally, I don't believe that, but I can't give you a pair ((p,q),R), such that p and q are statements, and R is a transformation that takes a pair (i,S) (where either i is 0 and S is a transformation from a proof of p to a proof of q, or i is 1 and S is a transformation from a proof of q to a proof of p) and converts it to a proof of 0=1 - and I have a gut feeling that an intuitionist will accept nothing less. ;-)
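For what it's worth, the gut feeling can be backed up: no such pair ((p,q),R) can exist for propositional p and q, because the double negation of the disjunction is already provable without excluded middle (this is an instance of Glivenko's theorem). A minimal sketch, in Lean 4 syntax; `example` just asks the checker to verify the term:

```lean
-- Intuitionistically provable: any R refuting (p → q) ∨ (q → p)
-- would itself yield a contradiction, so no refuting pair exists.
example (p q : Prop) : ¬¬((p → q) ∨ (q → p)) := fun h =>
  h (Or.inl (fun hp => (h (Or.inr (fun _ => hp))).elim))
```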
There is a word missing in your question, before "gives". Is it "usually", "always", "sometimes"...? Constructive proofs are such different beasts from most classical ones that it's not very hard to imagine two proofs, one classical and one constructive, such that the first gives much more insight than the second. In fact, it is trivially true if we remember that constructive proofs are also classical proofs. So if we read the question with "always", the answer is a '-'. If we read it with "sometimes", it is obviously '+'. And if we read it with "usually"... well, that depends on the measure on the space of proofs. :-) But I wouldn't say so. By your "analogy", would you really say that a quantum-mechanical explanation of some (classical) physical phenomenon gives more insight than a classical-mechanical one? Maybe only to a narrowly specialized quantum physicist.
(The "analogy" above is in quotes because the inclusion in fact goes the other way: quantum mechanics subsumes the classical one - whatever you can explain classically, you can explain quantum-mechanically. But whatever you can prove with classical logic, you won't necessarily be able to prove with intuitionistic logic.)
It depends on the notion of reasoning. Your explanation makes sense (for a change:) on one level, but there is also the notion of reasoning that distinguishes mathematicians from physicists, philosophers, artists and mystics: the one of formulating precise and disclosed axioms, using precise rules of inference, and preserving certainty along the way. You can say that "doing mathematics" is broader than that, but I still think there is something common to all reasoning in mathematics, something that defines math as a discipline, even if it can't be easily formalized.
Another interesting question with a totally bogus explanation. You can't really think that the number of people who accept some mathematical claim is a sensible measure of its truth?! If we think like that, even truth preservation can't survive - I'd bet many more people accept AC than the Banach-Tarski theorem, even though the first implies the second. :-)
And the statement in Animal Farm was obviously sarcasm... it was precisely saying that the animals were not equal. I don't think you'd like such an argument applied to your question. :-]
Sorry if this seems too harsh, and it is quite possible that I missed the point of your test completely. But you tried to give people some provocative ideas to think about, so I thought it only fair to give you some provocative counterarguments to think about. :-)