Another Semester, Another Sh*t Show

I was going to wait until traffic died down on this site, but I couldn't. University is like someone became mentally retarded and then decided everyone else needed to be mentally retarded too. The class that fits that description best is my proof class (mathematics). EASILY the most pretentious class on the face of the Earth. We're asked to prove things in a way that would be impossible unless you already knew they were true!

Everything I prove in that class, I think I've proven perfectly. I was once proving that 5x is always even as long as x is even. A tool I used in my proof was that multiplication is applied addition, so 5x is simply x+x+x+x+x, and since an even plus an even is always even, 5x is even. But NO! Multiplication, that thing humans made up, cannot simply be assumed to be applied addition! That assumption is far too brazen!
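
For context, here is a sketch of the style of proof such a class typically wants, working from the definition of evenness instead of expanding the multiplication (my wording, not the professor's):

```latex
\textbf{Claim.} If $x$ is even, then $5x$ is even.

\textbf{Proof.} Since $x$ is even, $x = 2k$ for some integer $k$.
Then $5x = 5(2k) = 2(5k)$, and $5k$ is an integer,
so $5x$ is even by the definition of evenness. $\blacksquare$
```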

The pretentiousness isn't even the worst thing to come out of that class. The professor tells us what we need to write on the test in order to get full points... AFTER THE TEST! And then the class average on the test was below 50%, what a joke. I failed that test. Had he said what he expected BEFORE giving the test, I calculated that my grade would have been a "C" at minimum. I aim for a "C" in my math classes because anything beyond that would require exponentially more study time and my GPA isn't greatly affected (I keep my scholarships). On several questions my proofs were correct, yet points were taken off for not following guidelines he never gave. It would have been hilarious if it didn't piss me off so much. No idea if he's planning on curving.


And assembly? Imagine a vague lecture you MIGHT understand if you had already known assembly before stepping in. At the moment, my assignment is to turn a string into a float, and then store that float in a variable that's passed by reference in a register. So here I am, with a floating point value in my XMM0 register, wondering how the hell he expects me to move it through the RDI register when he never explained how.
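
If I'm reading the assignment right, the answer may be that the float never goes into RDI at all: under the System V AMD64 calling convention (an assumption about the course setup), RDI holds the *address* of the by-reference variable, and the float gets stored at that address. A minimal sketch, NASM syntax:

```nasm
; sketch: store a float into a variable passed by reference
; (System V AMD64 ABI assumed, not confirmed by the course)
; callable from C++ as:  extern "C" void store_float(float &out);
; the converted value is assumed to already be in xmm0
store_float:
    movss [rdi], xmm0   ; write the 32-bit float to the address RDI points at
    ret
```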

And the assembly test? Points taken off practically for fun during the coding portion. There was no explanation, just a few red lines and points off everywhere - not that there was even enough time to finish the test at all. The professor never responds to emails, and the teaching assistants are as helpful as a can of tuna with no can opener.


My computer science algorithms class hasn't been too bad, in my opinion, though everyone else in that class seems to hate it. I won't know how I feel until the test scores come in, I suppose. I was the first one done and didn't feel like it was too big a deal, but perhaps that's just because I have more coding experience, who knows.


And what else... my computer logic class? This one is about wiring, using chips, and all that good stuff. I like the concept of the class, but it's impossible to like going to it. It's early in the morning, there are plenty of students but only a single teaching assistant, and the professor disappears after giving the driest, least enlightening lecture of his career. Then we sit there trying to wire things up and use a broken computer program. The worst part, however, is all the pre- and post-lab assignments we have. The most pointless things in existence, yet worth almost half our grade in the class. They take forever to finish, and by the time you finish them you wonder why anyone would have wanted you to make them at all.


I do have a class with a cool professor this semester, though. Probably the only class keeping me sane. Thanks for reading or skimming. Maybe the university will get bombed with me in it and I'll know peace.
We're asked to prove things in a way that would be impossible unless you already knew they were true!

well, no, if you apply the techniques from the class to an incorrect statement, it WILL tell you that your premise was bad. Try it on something that is not true... do a quick proof that the sum of all pairs of integers is actually zero.

The way the professors act is another story... you win some, you lose some. I started college at a high-end engineering school where the classes were taught by ESL grad students who knew maybe 5 words of engrish. I quickly changed schools; I was not going to pay top $$ for grad students to jabber nonsense.

They take forever to finish, and by the time you finish them you wonder why anyone would have wanted you to make them at all.
Well, this is all labs in all disciplines. We never wired anything - it cost too much - it was all done virtually in a CAD program / simulator. And still we were just building stuff like half adders or binary clocks or other boring stuff. The only thing to take away there is that you make big stuff from little stuff, same as in coding, so knowing how to make the simple stuff actually is pretty useful.
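
For instance, a half adder is just two gates - XOR for the sum bit and AND for the carry - and two half adders plus an OR give you a full adder. A sketch in x86-64 assembly, with bitwise instructions standing in for the gates (register choices are mine):

```nasm
; half adder on single bits (a sketch; inputs assumed in al and bl as 0 or 1)
; returns: sum in al, carry in bl
half_adder:
    mov cl, al      ; save a
    xor al, bl      ; sum   = a XOR b
    and bl, cl      ; carry = a AND b
    ret
```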

All I have to offer is the words from an old song by an old hippie:
I spent 4 years prostrate to the higher mind, got my paper and I was free...




I'm with @jonnin here.

I find it unfortunate that students assume they go to college to be taught. Note that I didn't say learn, but to be taught.

Consider the Socratic method. The method involves presenting questions, but not answers. This assumes the student finds the answers.

It can be disorienting. I'm not certain I fully agree with it, though I recognize the value.

All of us way beyond our youth use it. We have no choice, really. We take a base of understanding cobbled together in our youth, but along our career we are faced with new language features, new languages, new operating systems... and then must learn how to make use of them on our own. This is the same thing, fully Socratic in method. We have before us something we've not seen before, say C++17 (or, for many of us, C++11 through C++17), which is really quite foreign though based upon what we already understood from years before. We have no teachers. We only have questions.

Fortunately we also have the merciful writings of some really great authors to illuminate the path. We search, read, experiment and finally learn the new materials.

All that entirely without other supervision.

It may seem a joke, but in the modern view it seems to me the only way to proceed is to study the subject before the class begins, armed with applicable material which aids in coping with the onslaught of the course materials.

It is a bit of Zen, to assume that our expectation should be set aside because it promotes our discontent. Instead, we must assume the class is not where we will be given explanations, but only questions. We must find the explanations elsewhere, then apply them as best we can.

This is a stark contrast to the elementary educational experience we had before college. There the population is an unwilling mob. We can't hope for most of them to care. They are obligated to be there by law, and most don't care one bit how they get out of it. The main difference in college, it may be expected, is that the students have opted to be there, and as such are willing participants.

This is no comfort, I know. Students are being left to their own wits, and many will fail as a result.

I think it is worse in this particular study than many others. The pace of technological change surpasses the ability of curriculum to follow. The "teachers" may, themselves, have little motivation to study emerging materials and therefore have limited competence. In this particular field I sense it works better if the students are far ahead, studying on their own, treating the classes as an interruption, merely a means of earning the degree. Unfortunately this means they would have to begin as freshmen in high school to be on pace.

Some have, and it was so in my time (the late '70s and early '80s). I entered knowing the material, so the courses were minor adjustments to what I already knew full well. I was among the very first generation where a computer could be purchased in a store. Before my teen years that simply did not exist. By the time I got to college I had been a "programmer", of sorts, for several years. I had no choice but to build my first computer, back when that meant soldering a few thousand leads onto the motherboards (there were a lot more chips, each with a tiny fraction of the transistors in them). There were lots of those 74LS-series ICs, used even today to teach early digital electronics (they are classics - gates, typically).

Once you are already in the pipeline (freshman, sophomore) in college, it seems to me you're already being hit with the unfortunate reality that you've been thrust into an arena with challenges, questions, problems to solve, but very little in the form of explanation and disclosure.

I don't know of any other way out. There are texts and papers, blogs and discussions on all of it. It does seem, though, like "going to a second school" while attending the one that offers the degree you require.


Niccolo wrote:
We have before us something we've not seen before, say C++ 17 (or, for many of us, C++11 through C++17), which is really quite foreign though based upon what we already understood from years before. We have no teachers. We only have questions.
I began trying to learn C++ before there was C++03. Before there was even C++98, IIRC. Self-teaching from books, many badly written, using antiquated tools only a half-step up from stone knives and bear skins.

I had nothing but questions, trying to answer them when web resources were rudimentary at best, as well as being over-complicated to the point of incomprehensibility.

As I got older and the C++ language matured with C++11, self-teaching became less of a headache. From prolonged familiarity, probably. For me, learning to program is a hobby, not a profession or a job.

Learning Assembly would be more than I would care to delve into as a hobby. Learning C++ and Win32 is about my limit.

I still have plenty of questions, but there are more resources now to satiate my curiosity.

I have never expected to be taught, I am not a mushroom demanding to be force-fed the knowledge as I sit passively vegetating in a lecture hall or class-room.

MAKE me want to find the answers. Light my path.

</rambling off>
zapshe wrote:
I was going to wait until traffic died down on this site, but I couldn't. University is like
Why wait? The Lounge is a great place to have a rant about life's trials and tribulations.

It has been decades since I subjected myself to formal education, and I don't miss the experience one bit.

My desire to learn programming pushes me to spend my time investigating what interests me, not what some "expert" deemed the "correct method" to learn C++.

I know some of the practical side of programming, the nuts and bolts. The theory of how the language is put together at the machine-code level I find boring.

Assembly is more than I care to know. And likely more than I can know and understand.

At least I don't have to spend a lot of money to learn C++ the way I want, at my own pace.
well, no, if you apply the techniques from the class to an incorrect statement, it WILL tell you that your premise was bad

My issue with proofs is that they require you to show something beyond^infinity a reasonable doubt. There are plenty of problems that seem to require you to already understand why something is/isn't true before you can even start the proof - which kind of defeats the purpose.

Thanks for sharing your experience. So far I've had only a single professor whose English was unbearable, and that was last year.


@Niccolo

Everything sounds about right. During grade school, I was reluctant to care; I didn't opt to be there. Even so, I got good grades - it felt impossible to fail. With programming, I knew a degree was needed these days to get a good position; many places won't even look at a resume without one. Even though I opted to go to college, it just feels like another adventure of stuff I didn't want to do. It's also hard to teach myself like I used to when my free time gets eaten up with pointless assignments. For my assembly class, it's like they know they haven't taught us enough to code the assignment, so they actually code a nice chunk of it for you to get you started.


Why wait? The Lounge is a great place to have a rant about life's trials and tribulations.

Just since people were probably busy helping others with questions. But I suppose you're right!


Assembly is more than I care to know. And likely more than I can know and understand.

Assembly feels like regular coding, only with more steps. It was actually a bit fun to learn at first, but at this point we're beating a dead horse. Why would I ever need to link a C++ program to an assembly program where I pass 8 arguments and have to access some of them through the stack, which never seems to hold the address I expect?!
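
For reference, a sketch of where those arguments actually land, assuming the System V AMD64 ABI (my assumption about the course, not something stated in the thread): the first six integer arguments arrive in registers, and arguments 7 and 8 sit on the stack just above the return address on entry.

```nasm
; sketch: 8 integer arguments under the System V AMD64 ABI (assumed)
; called from C++ as:  extern "C" long f(long a1, ..., long a8);
; a1..a6 arrive in rdi, rsi, rdx, rcx, r8, r9
; on entry, [rsp] is the return address, so:
;   a7 -> [rsp + 8],   a8 -> [rsp + 16]
f:
    mov rax, [rsp + 8]    ; seventh argument
    add rax, [rsp + 16]   ; ... plus the eighth
    ret
```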



Thanks for the replies, I enjoyed reading them!
to show something beyond^infinity a reasonable doubt.
it's called a proof, not a hypothesis :P if it's just a hypothesis, you can have reasonable doubt or assurance that you know based on evidence, but for a proof, they want it formally beaten to death, yes.

I am not sure how important assembly is anymore. I feel it helps you understand what really happens in the chip when your code runs, which helps once in a rare while when coding at a higher level (e.g., you should now understand why branch prediction is nice but has limitations, or why jumps can disrupt the pipeline), especially if you are trying to get just a little more speed in an inner loop.
What's interesting about proofs is that they're just compelling arguments in support of statements. There's no general algorithm to show that an arbitrary proof is valid; at best you can show that a proof contains no self-contradictions. Can you really tell whether the statement is actually true or whether a deceptive proof has fooled you? For example, there have been many proofs for the four color theorem that were incorrectly believed to be valid for many years. How many false conjectures are currently believed to have been proven true?
What's interesting about proofs is that they're just compelling arguments in support of statements. There's no general algorithm to show that an arbitrary proof is valid; at best you can show that a proof contains no self-contradictions.

Same exact thought went through my mind!

A proof is a construct that wants something proven beyond a reasonable doubt in its own flawed way. You can't know within the proof whether or not you've missed an important variable or aspect until it's found out. Constructing a proof is logical, but proofs themselves are so pretentious that it almost becomes illogical at some point depending on the proof. Some things don't make sense to prove, or even try to prove, by the standards of these mathematical proofs. And again, it almost seems like you have to already know the answer before you can prove it!
A proof is a construct that wants something proven beyond a reasonable doubt
No, "beyond reasonable doubt" is an insufficient standard of evidence for mathematical proofs. If there's any doubt whatsoever that the conjecture is true then the proof has failed. This is why the Collatz conjecture is not considered proven, even though it has been shown to hold for very large numbers. The "reasonable" part of reasonable doubt assumes some kind of common sense. For example, a person can normally be assumed not to have transported themselves at the speed of sound between where they were seen at one time and where a crime was committed somewhere else. Mathematics is a heavily unintuitive realm where common sense is useless.

Constructing a proof is logical, but proofs themselves are so pretentious that it almost becomes illogical at some point depending on the proof.
On the face of it, I don't agree with this. What exactly do you mean? If something sounds illogical or nonsensical it's usually a very bad sign.

And again, it almost seems like you have to already know the answer before you can prove it!
Sometimes that's true, to an extent. There are some stupidly obvious statements that are very difficult to prove. Other times I've found myself trying to prove a statement to see if it was true.

For example, one proof is to show that a line and a circle don't meet (given the equations). It should be as simple as drawing the line and circle on a graph and seeing they don't intersect or touch, but that wouldn't "prove" it.
The thing is that a graphical proof can only prove some cases. If the line and the circle are really far apart then they obviously don't touch, and if the line passes through the center then they obviously touch, but if they appear tangent on the drawing then you have to do something else, because you can't tell if they really are tangent or if your drawing has insufficient precision.
If you look for solutions to a system of equations you always get an exact answer.
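
As a worked instance of the algebraic route (equations mine, chosen for illustration):

```latex
\begin{aligned}
&\text{Circle: } x^2 + y^2 = 1, \qquad \text{Line: } y = x + 3 \\
&x^2 + (x+3)^2 = 1 \;\Rightarrow\; 2x^2 + 6x + 8 = 0 \;\Rightarrow\; x^2 + 3x + 4 = 0 \\
&\Delta = 3^2 - 4 \cdot 1 \cdot 4 = -7 < 0
\end{aligned}
```

The negative discriminant means there are no real solutions, i.e. no intersection points - exact, for every point at once, with no reliance on drawing precision.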

Something similar happens when looking for limits on infinity. Testing big numbers on a calculator is often useful to get a general idea, but it doesn't replace proving the limit.
If there's any doubt whatsoever that the conjecture is true then the proof has failed.

It MUST be reasonable. You can't say you doubt a proof because it uses too many words, for example. That's not a reasonable doubt. You can't say the doubt is that trying to prove something caused a reaction that makes every brain that attempts the proof get it wrong, and makes everyone who reads the proof incapable of seeing the error. That goes beyond reasonable doubt into baseless conjecture - so not every doubt makes a proof fail; the standard is that the doubt is reasonable by some agreed standard.

Mathematics is a heavily unintuitive realm where common sense is useless.

I do get this point, but it only holds true when looking at special cases. Common sense breaks down at the speed of light, near a black hole's gravity, in quantum physics, etc. But most things will abide by common-sense standards. For example, algebra is basically common sense in mathematical form.


On the face of it, I don't agree with this. What exactly do you mean? If something sounds illogical or nonsensical it's usually a very bad sign.

In class, we've had problems where you'd need to already know why something is true in order to construct the proof for it. This already makes it useless because it means you can't make the proof without already knowing by ANOTHER standard why it's true.

Next, to prove this thing is true, you can't just show that it is. For example, you can't simply show that a line and a circle don't intersect on a graph and say you've proved it, you must show logically that it's true rather than presenting the answer - pretty illogical.

Many times, you'll construct a proof and anyone looking at the proof would not be able to know what the hell you were doing unless they had already known how to prove said thing. Moreover, what kind of standard is being used to say whether or not something was proven? You say it must be a proof beyond all doubt, but I think that standard is impossible. Clearly, if ANY proof has ever later been found to be incorrect, then there was a doubt that no one had. Therefore, the biggest flaw of any proof is that HUMANS make proofs, not Gods, which makes proofs pointless. You can always discover another variable that you couldn't have known about when constructing the proof.

I'll again say that multiplication is a human construct, and is simply applied addition. Saying 5x is the same as x + x + x + x + x was something the professor left a comment on, saying I couldn't do it. By what standard? If I can't use something known (known because people made it up!), then how could any of the proofs in class, where he creates random variables to try and prove something, work? I could just as easily object that adding a variable to construct the proof may alter the logic of the proof, so you can't do it. It wouldn't even make sense.

There are some stupidly obvious statements that are very difficult to prove.

Yes, but it's almost pointless. The GCD of a prime and any other number is going to be either 1 or the prime itself. One problem asked us to prove that if p doesn't divide a, then the GCD is always 1. I got points taken off on my test because I proved that if p != a, then the only GCD left to choose from is 1. The issue was that I didn't prove what happens if a = p. I went up to the TA and said it's the same exact thing - if p = a, then it doesn't matter; I wasn't trying to prove anything other than the case p != a. He said I had to prove it the other way around, or something stupid to that effect. It was pure mental retardation I was dealing with.

Other times I've found myself trying to prove a statement to see if it was true.

The difference here is that you're going to prove it by your standards I'd assume. It wouldn't be something that would have to survive a professor.


The thing is that a graphical proof can only prove some cases.

It was clear that the line and circle didn't touch. My professor says it can't "prove" it because it just gives you the answer - however that works. We're working with sets now, and he says you can't show that set A is in set B using a Venn diagram, which would clearly show whether two sets were the same or shared common elements.

Something similar happens when looking for limits on infinity. Testing big numbers on a calculator is often useful to get a general idea, but it doesn't replace proving the limit.

I'd say the difference is that a graph won't actually tell you the answer, it only hints at it. It's like a "gotcha!" scenario.
It MUST be reasonable. You can't say you doubt a proof because it uses too many words, for example. That's not a reasonable doubt. You can't say the doubt is that trying to prove something caused a reaction that makes every brain that attempts the proof get it wrong, and makes everyone who reads the proof incapable of seeing the error. That goes beyond reasonable doubt into baseless conjecture - so not every doubt makes a proof fail; the standard is that the doubt is reasonable by some agreed standard.
That's not what the phrase "reasonable doubt" refers to.

Consider the Collatz conjecture. A proof beyond reasonable doubt might argue "the Collatz conjecture has been shown to hold up to 2^256 - 1, therefore we can conclude that it is very probably true for all naturals". The rationale behind such an argument would be that in everyday human experience, properties that hold for a significant portion of a set usually also hold for the whole set. A "reasonable person" would accept that argument because it matches their expectations.
Needless to say, such an argument would not be rigorous enough for mathematics.

I do get this point, but it only holds true when looking at special cases. Common sense breaks down at the speed of light, near a black hole's gravity, in quantum physics, etc. But most things will abide by common-sense standards. For example, algebra is basically common sense in mathematical form.
You're abusing the phrase. What you mean to say is that you've done simple algebra so much that you can do it easily. You can't apply common sense to algebra because it's entirely removed from your everyday experience. At no point will you hold a system of linear equations in your hand.

This already makes it useless because it means you can't make the proof without already knowing by ANOTHER standard why it's true.
This is purely a pedagogical problem. I assure you, if the teacher were to start from first principles and then proceed to give nothing but proofs, you would leave the class having understood nothing new. Simply put, proofs are a mathematical tool, not a teaching tool. You won't learn, say, how to solve Diophantine equations by studying the proof of the Chinese remainder theorem.

Many times, you'll construct a proof and anyone looking at the proof would not be able to know what the hell you were doing unless they had already known how to prove said thing.
A proof doesn't need to show how the mathematician figured out how to prove the conjecture. It just has to make a case for the truth of the conjecture. Likewise, the person reading the proof doesn't need to understand how the proof was arrived at. As long as the premises are true and the argument is valid, the proof is successful.

I'll again say that multiplication is a human construct, and is simply applied addition. Saying 5x is the same as x + x + x + x + x was something the professor left a comment on, saying I couldn't do it. By what standard?
It depends on the axiomatic system being used. In a field (such as the reals) multiplication is not just iterated addition, but is a fundamental operation. Check the field axioms and you'll see that multiplication is not defined in terms of addition. As a simple example, what iterated addition is equivalent to pi * e?
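
For reference, the field axioms for multiplication mentioned above (standard statements, not from this thread) - note that none of them defines multiplication in terms of addition; only distributivity connects the two operations:

```latex
\begin{aligned}
&\text{Associativity:}  && (ab)c = a(bc) \\
&\text{Commutativity:}  && ab = ba \\
&\text{Identity:}       && \exists\, 1 \neq 0 \text{ such that } 1 \cdot a = a \\
&\text{Inverses:}       && \forall a \neq 0,\ \exists\, a^{-1} \text{ such that } a \cdot a^{-1} = 1 \\
&\text{Distributivity:} && a(b + c) = ab + ac
\end{aligned}
```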

Yes, but it's almost pointless. The GCD of a prime and any other number is going to be either 1 or the prime itself. One problem asked us to prove that if p doesn't divide a, then the GCD is always 1. I got points taken off on my test because I proved that if p != a, then the only GCD left to choose from is 1. The issue was that I didn't prove what happens if a = p. I went up to the TA and said it's the same exact thing - if p = a, then it doesn't matter; I wasn't trying to prove anything other than the case p != a. He said I had to prove it the other way around, or something stupid to that effect. It was pure mental retardation I was dealing with.
Hmm... I have to say I agree with the TA. When you introduce extra assumptions you need to check what happens when those assumptions are false, to ensure that the final result doesn't become invalid. All you needed to say was this:

If p = a then p divides a, since p divides p, and that contradicts the hypothesis, therefore necessarily p != a.
If p != a (rest of the proof)

and you could have avoided the issue.
That's not what the phrase "reasonable doubt" refers to.

It simply means "a doubt especially about the guilt of a criminal defendant that arises or remains upon fair and thorough consideration of the evidence or lack thereof", from Webster's Dictionary. Maybe it has a different mathematical meaning that I'm not aware of?

What you mean to say is that you've done simple algebra so much that you can do it easily.

No, basic algebra is simple common sense. For example, if you made a deal with someone that whenever you let them borrow money, they'd return you the money doubled, then it's common sense that if you let them borrow X amount, that they should return 2X. This can be turned into a simple algebraic equation, but the whole thing was already common sense beforehand.

When I was in any algebra class, I never felt like I was learning; it just felt like simple logic put in mathematical terms. This isn't to say all of algebra is this way - some of it is beyond common sense. However, the basics of algebra are deeply rooted in common sense.

You can't apply common sense to algebra because it's entirely removed from your everyday experience.

This is entirely false in the way you mean it. All of middle school and high school was a bunch of word problems that were algebra in disguise. Algebra as a subject is completely mathematical and doesn't care about common sense, but the opposite isn't true. Parts of common sense, and the entire world, are based upon mathematical principles.

At no point will you hold a system of linear equations in your hand.

At no point will you ever hold a tangible representation of the English language in your hands. You can't hold common sense in your hands either. It's not a very impressive argument. You can hold something which can be represented by a linear equation, but how can the fact that you can't hold an idea in your hands prove that algebra is removed from everyday experience? Algebra as a concept doesn't care about the world, it obviously can't, but it was created because it can REPRESENT things in the world - the same as common sense.


I assure you, if the teacher were to start from first principles and then proceed to give nothing but proofs, you would leave the class having understood nothing new. Simply put, proofs are a mathematical tool, not a teaching tool.

If a proof cannot present new information, what's the point? That's the source of my hatred for them: they serve no purpose except in maybe a few rare cases I could imagine. This is especially true when all a proof does is use what's already known to create a statement which is fundamentally true, and therefore can be wrong since it may not take into account a variable that has not yet been discovered.


A proof doesn't need to show how the mathematician figured out how to prove the conjecture.

Another reason why it's useless. Imagine telling someone X is true, and when they ask why you give them an explanation that'll go over their heads unless they already had the knowledge which would have made them not ask to begin with.

Likewise, the person reading the proof doesn't need to understand how the proof was arrived at. As long as the premises are true and the argument is valid, the proof is successful.

That's not logical. If the person reading doesn't know where the proof was derived from, then they've completely missed out on how some of the premises were reached, in which case how can they verify that the premises were true? In simple proofs, the premises are easy to verify, but in more complex ones it looks like they were pulled out of thin air, and you can't verify them without attempting the proof yourself (and having sufficient knowledge to do so).

It depends on the axiomatic system being used

With the assumptions made and the conclusion being proved, it was very clear.

As a simple example, what iterated addition is equivalent to pi * e?

The proof restricted us to whole numbers, otherwise even and odd wouldn't be possible. And in that situation, it's simply applied addition. You could argue that even pi * e is applied addition. You'd have pi and e lined up to as many digits as you want to calculate to, then wherever you would have multiplied two numbers, you simply add X amount of times. You could write a for-loop to accomplish this. Then you'd have achieved multiplication with only addition. Though this is kind of a hack, I suppose.
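
In the spirit of that for-loop idea, a sketch of integer multiplication done with repeated addition, in x86-64 assembly since that's the other class in this thread (register choices and labels are mine):

```nasm
; sketch: rax = rdi * rsi using repeated addition (rsi assumed >= 0)
mul_by_add:
    xor rax, rax        ; product = 0
.loop:
    test rsi, rsi       ; done when the counter hits zero
    jz .done
    add rax, rdi        ; product += multiplicand
    dec rsi             ; counter -= 1 (the product itself is built
    jmp .loop           ;  from additions alone)
.done:
    ret
```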


When you introduce extra assumptions you need to check what happens when those assumptions are false, to ensure that the final result doesn't become invalid

What?? I didn't bring in any extra assumptions. The question said prove that if P is prime and p doesn't divide a, then the GCD is 1. If P doesn't divide A, then how can A = P? I used no new assumptions; it was a direct proof. If P doesn't divide A, that means the only common divisor left is 1. P can only divide A if P <= A, and it doesn't divide A. There shouldn't be anything else needed. By the very definition of a prime number, the proof should have been done.


you could have avoided the issue.

There shouldn't have been an issue. I can't predict mental retardation; there's no thought process to predict. What's a valid answer today won't be tomorrow with these professors and TAs.
A mathematical proof is not a proof 'beyond reasonable doubt'. Saying that is absurd. A mathematical proof is not determined as in a legal case. There is absolutely no doubt whatsoever when a mathematical proof is made or derived.

Multiplication is not fundamental to what is commonly termed number theory - or, more specifically, number theory based on Peano's axioms - multiplication is derivative.

And a proof doesn't present new information other than proving the (already made) proposition.
Everything I prove in that class, I think I've proven perfectly. I was once proving that 5x is always even as long as x is even.
Unfortunately you have misled yourself by making an assumption that may or may not be true.

The 'proof' is only half true and never addresses what an even (or odd) number is.

The approach is not unreasonable - creative even. The lack of rigor though in not pinning the implicit (and explicit) assumptions down is unreasonable. I reckon, as it stands, it's worth about 50% tops, 40% more likely.
It simply means "a doubt especially about the guilt of a criminal defendant that arises or remains upon fair and thorough consideration of the evidence or lack thereof", from Webster's Dictionary. Maybe it has a different mathematical meaning that I'm not aware of?
"Reasonable doubt" is simply a standard of evidence.
https://en.wikipedia.org/wiki/Burden_of_proof_(law)#Legal_standards_for_burden_of_proof
None of those standards are sufficient in mathematics.

For example, if you made a deal with someone that whenever you let them borrow money, they'd return you the money doubled, then it's common sense that if you let them borrow X amount, that they should return 2X. This can be turned into a simple algebraic equation, but the whole thing was already common sense beforehand.
Now give an example with complex numbers and polynomials.

This is entirely false in the way you mean it. All of middle school and high school was a bunch of word problems that were algebra in disguise. Algebra as a subject is completely mathematical and doesn't care about common sense, but the opposite isn't true. Parts of common sense, and the entire world, are based upon mathematical principles.
What I said was that mathematics is removed from everyday experience, not that everyday experience is unrelated to mathematics. Obviously everyday experience can be modelled in terms of mathematics. That's why it's useful. However, mathematics is inherently abstract and unintuitive. That's why most people don't like it, because it requires serious effort to grasp, and you can't fudge it. If you screw up it's completely obvious.

If a proof cannot present new information, what's the point? [...] Another reason why it's useless. Imagine telling someone X is true, and when they ask why you give them an explanation that'll go over their heads unless they already had the knowledge which would have made them not ask to begin with.
The point is to show that a statement is unquestionably true, ideally in as few steps as possible (a proof that's too long would not be comprehensible). That's all.
A proof is not an explanation. It doesn't need to show you why the statement is true, it just needs to show that it's true. In some cases a proof may show that something has some property as a direct result of some other fact, and if that fact wasn't true the property wouldn't hold, but much more often there's no obvious explanation.
https://www.smbc-comics.com/comic/2010-06-20

This is especially true when all a proof does is use what's already known to create a statement which is fundamentally true, and therefore can be wrong since it may not take into account a variable that has not yet been discovered.
For example?

That's not logical. If the person reading doesn't know where the proof was derived from, then they've completely missed out on how some of the premises were reached, in which case how can they verify that the premises were true?
Because some problems are hard to solve, but their solutions are easy to check.
https://en.wikipedia.org/wiki/NP-completeness
Suppose I crack your RSA private key. In order to show you that I've done it, I don't need you to do all the computations I've done. All I need to do is give you the private key and have you compare it to yours.

What?? I didn't bring in any extra assumptions. The question said prove that if P is prime and p doesn't divide a, then the GCD is 1. If P doesn't divide A, then how can A = P?
The teacher needed to know that you understood that, and the way to do that was to write the reasoning I wrote above. If you just write "p != a" without any explanation, how can they know how you arrived at that conclusion? Yes, it's true that p not dividing a implies that p != a, but there's a specific reason why that implication holds, and infinitely many wrong reasons.
For example?
Happens in physics frequently. We have an equation we know should model something, but it's not quite giving the exact expected observed value; therefore there must be something we did not account for... go back and discover what that is... oh, there is yet another subatomic particle in play (or, if you go way, way back, it was air resistance / friction / rotation of the earth / whatever). But the math tells you that you failed to account for something, even if you don't know what it was.

I thought we were talking about mathematics. Proofs are for sure used in theoretical physics, but those have a different degree of immutability than mathematical proofs. Obviously if your proof assumes something and that turns out to be false, the proof becomes vacuous (valid but unsound), yet still true within the chosen axiomatic system. Has something like that happened in mathematics?
The 'proof' is only half true and never addresses what an even (or odd) number is.

I didn't put my whole proof here. I got full points for that proof; the 5x thing was just a comment on my paper. I lost points for not following the guidelines he never gave until after the test.

Now give an example with complex numbers and polynomials.

Not that I can't, but they wouldn't be as intuitive - again, I'm talking about basic algebra.

What I said was that mathematics is removed from everyday experience, not that everyday experience is unrelated to mathematics

Which is exactly why your statement doesn't disprove what I said about algebra being mostly common sense. Not because algebra IS common sense, but because common sense has a basis in basic algebra.


A proof is not an explanation

A proof is an explanation in the sense that it starts with a premise and proves something. All the lines of the proof are a mathematical explanation that may or may not explain the fundamental basis of why something is true/false. For example, a direct proof will likely explain why something is actually true, while a proof by contradiction will only explain why something MUST be true.

For example?

You yourself mentioned the proofs that were incorrectly believed to be valid:

https://mathoverflow.net/questions/35468/widely-accepted-mathematical-results-that-were-later-shown-to-be-wrong

Why were they believed? A case that contradicted them wasn't found until later - something wasn't taken into account. Proofs are pointless in that they only reflect what's already known. It's almost like that one annoying guy who repeats things after someone else figures them out, because it just clicked for him - what's the point?


Because some problems are hard to solve, but their solutions are easy to check.

If that's all you needed, then the proof itself was pointless, because you were likely required to prove something already known. I could easily write a whole bunch of nonsense on a test and say, "It's too complex for you, but just look at the final result, it's clearly true." How can you validate that the proof came to its conclusion without a logical fallacy if you can't verify the steps?


If you just write "p != a" without any explanation, how can they know how you arrived at that conclusion? Yes, it's true that p not dividing a implies that p != a, but there's a specific reason why that implication holds, and infinitely many wrong reasons.

Again, I didn't put the entire proof on here. But if you want it, it went like this:

"Since P divides itself and is prime, meaning the only numbers that divide P are P and 1, if P doesn't divide A, then the only other number that can divide P and A is 1, hence GCD(P,A) is 1 only if P doesn't divide A."

When I took it to the TA, he admitted he misread it and should have given more points, but made the argument that I didn't fully prove it. Looking at the problem now, it says to prove that the GCD is 1 ONLY IF P doesn't divide A. Even though it's crazy obvious and completely implied that if P divides A then the GCD will be P, I didn't explicitly state what the outcome would be had P = A. The problem was out of 10 points and I received 1 point. 5 points off were due to not following the proof guidelines he didn't give us, and then 4 off because he misread what I wrote. The fact that he didn't tell us what he expected on the test (which was worth 5 of every 10 points) meant it was impossible to get over 50% on the test. And just as you'd imagine, the average was below 50%.

I don't know why you feel you have to defend proofs, they're pointless. If I need to prove something, I'd say MY standards are much more reasonable. For example, I won't assume I've proven something just because I made a non-contradictory mathematical expression ;) To prove something truly, you require a higher degree of certainty, and it doesn't even require all the pretentiousness from this class.
A proof is an explanation in the sense that it starts with a premise and proves something. All the lines of the proof are a mathematical explanation that may or may not explain the fundamental basis of why something is true/false. For example, a direct proof will likely explain why something is actually true, while a proof by contradiction will only explain why something MUST be true.
No. Again, a proof is just an argument in favor of a proposition. There's a difference between explaining why something is true and showing that it's true. A proof must do the latter, but doesn't necessarily do the former.

You yourself mentioned the proofs that were incorrectly believed to be valid [...] Why were they believed?
They were believed because the people who read them didn't fully understand them. It's not like the proofs were correct and then something new was discovered that rendered them incorrect. They were never correct; they were believed to be correct, and then someone found counterexamples that definitively showed flaws in the reasoning. But if a proof is actually correct then there's no way to overturn it.

If that's all you needed, then the proof itself was pointless, because you were likely required to prove something already known. I could easily write a whole bunch of nonsense on a test and say, "It's too complex for you, but just look at the final result, it's clearly true." How can you validate that the proof came to its conclusion without a logical fallacy if you can't verify the steps?
You misunderstood. The problem is "how do I prove P?" The solution is the proof of P. You can check the solution by following each step in the reasoning of the proof, looking for fallacies, but you don't need to prove P yourself to check the proof.

"Since P divides itself and is prime, meaning the only numbers that divide P are P and 1, if P doesn't divide A, then the only other number that can divide P and A is 1, hence GCD(P,A) is 1 only if P doesn't divide A."
I think this is a valid proof, but you could have written it a bit more clearly. Try not to jam all your steps into the same sentence. E.g.
1. Since p is prime, its divisors are 1 and p.
2. Since p doesn't divide a, p is not in the divisors of a.
3. From #2, the intersection of the divisors of p and the divisors of a is {1}.
4. Since GCD(p, a) is the maximum of the intersection of the divisors of p and the divisors of a, GCD(p, a) = 1. QED.

I don't know why you feel you have to defend proofs, they're pointless.
I'm sorry you don't see their value, but they're certainly not pointless. Even for non-mathematicians, they're good training to express ideas clearly and to exercise mathematical rigor.