Thursday, October 30, 2014

Exam 2 Review, Due October 31

  • Which topics and theorems do you think are the most important out of those we have studied?
Principle of Mathematical Induction and Strong Principle of Mathematical Induction
Reflexive, symmetric, and transitive properties
Theorem: [a]=[b] if and only if aRb
Functions: domain, codomain, range, surjective, injective, bijective, composition
  • What kinds of questions do you expect to see on the exam?
Proofs by PMI/SPMI.
Knowing the properties of relations
Prove a relation is an equivalence relation
Proving functions are bijective
  • What do you need to work on understanding better before the exam? 
A refresher on SPMI problems (see the outline I sketched below)
Equivalence class problems 
Well-defined vs Well-ordered
(I've got the hang of the recent chapters; it's just about getting those rusty gears from chapters 6 and 8 moving again!)
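
To jog my own memory on SPMI, here's the general outline the way I'd write it (just my own summary of the skeleton, so the wording probably won't match the book exactly):

```latex
% My own summary of the SPMI outline (wording may differ from the book).
\textbf{Base case(s):} verify $P(n)$ directly for the first value(s) of $n$
(sometimes more than one, e.g.\ $n = 1$ and $n = 2$).

\textbf{Inductive step:} assume $P(i)$ holds for \emph{every} $i$ with
$1 \le i \le k$ (not just for $i = k$), and use that to prove $P(k+1)$.

\textbf{Conclusion:} by the Strong Principle of Mathematical Induction,
$P(n)$ holds for all $n \ge 1$.
```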

Monday, October 27, 2014

9.6-9.7, due on October 29

Difficult: Whoa, big concept chapters! A lot to take in! First off, the proof that the inverse of a function is itself a function if and only if the original function is bijective was dense. If we're going to need to prove that on our own, it would be helpful to review it again in class. Also, are rational and linear functions the only ones whose inverses we can easily find? The book mentioned we can't find inverses for some polynomial functions, but are there other kinds we can handle?
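
For my own notes, here's one non-linear, non-rational function I'm pretty sure we can still invert (my own pick, cubing on the reals, not an example from the book):

```latex
% My own example: a non-linear polynomial we can still invert.
\[
  f:\mathbb{R}\to\mathbb{R},\qquad f(x) = x^{3},
  \qquad\text{with}\qquad
  f^{-1}(y) = \sqrt[3]{y},
\]
since $f^{-1}(f(x)) = \sqrt[3]{x^{3}} = x$ and $f(f^{-1}(y)) = (\sqrt[3]{y})^{3} = y$.
(By contrast, $f(x) = x^{2}$ has no inverse on all of $\mathbb{R}$ because it
isn't injective.)
```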

For permutations, do we always use natural numbers that come right in a row? Or could we use B = {2,5,8}, or even B = {a,b,c}, to build permutations? Also, what do we actually do with them...? Or do we just look at them, since it's only a concept for now?

Finally, at the end it said "permutations satisfy the properties of closure." What is a property of closure?
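
Here's my best guess at what "closure" is getting at, with a tiny example I worked out myself on {1,2,3} (so double-check me):

```latex
% My guess at "closure": composing two permutations of {1,2,3} gives
% another permutation of {1,2,3}.
\[
  \alpha:\ 1\mapsto 2,\ 2\mapsto 3,\ 3\mapsto 1,
  \qquad
  \beta:\ 1\mapsto 1,\ 2\mapsto 3,\ 3\mapsto 2.
\]
\[
  (\beta\circ\alpha)(1)=\beta(2)=3,\quad
  (\beta\circ\alpha)(2)=\beta(3)=2,\quad
  (\beta\circ\alpha)(3)=\beta(1)=1,
\]
so $\beta\circ\alpha$ is again a bijection from $\{1,2,3\}$ onto itself,
i.e.\ another permutation of the same set.
```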

Interesting: Well, the one thing I do like about permutation notation is that it's easier to write than listing out all the commas and parentheses. I also thought it was interesting how we can help a function have an inverse by cutting off the part of the codomain we don't use, which I guess is kind of like what we did for the inverse trig functions.

Saturday, October 25, 2014

9.5, Due October 27

Difficult: For some reason, it always throws me off that g ∘ f is g(f(x)); I keep thinking the letter that comes first should be the function we do first. I think I've got it figured out now, though.

Also, it started talking about how ran f only needs to be a subset of B' for (g ∘ f): A → C to be defined. B' does NOT mean the derivative of B, correct? Because that would be MEGA confusing.
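
To convince myself about the order, I made up a quick example with real functions (my own choice of f and g, not from the book):

```latex
% My own example: g \circ f means "do f first, then g."
\[
  f(x) = x + 1,\qquad g(x) = x^{2}.
\]
\[
  (g\circ f)(x) = g(f(x)) = (x+1)^{2},
  \qquad
  (f\circ g)(x) = f(g(x)) = x^{2} + 1.
\]
At $x = 1$ these give $4$ and $2$, so the order really does matter.
```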

Interesting: At first, this concept was totally tricky. However, it made a lot more sense with the nice picture with circles and arrows, along with the examples of finite sets. I also never knew that composition could be associative, yet not commutative. I thought that was fancy!

Thursday, October 23, 2014

9.3-9.4, Due October 24

Difficult: Alright, now I've got a lot of concepts floating around in my head that I need to work on solidifying. So we have three main special properties of functions that we've learned: one-to-one, onto, and bijective. I understand most of them visually, like on a graph, but I need to understand them conceptually now. So, if we have to prove something is bijective, we must prove it is both one-to-one and onto? Several examples of those two kinds of proofs would be helpful.
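
Here's the two-part proof I think we'd write, on a toy example I picked myself (f(x) = 2x + 3 on the reals), so the wording may not match the book's:

```latex
% My practice proof that f(x) = 2x + 3 is a bijection from R to R.
\textbf{Injective (one-to-one):} if $f(a) = f(b)$, then $2a + 3 = 2b + 3$, so $a = b$.

\textbf{Surjective (onto):} given any $y \in \mathbb{R}$, let $x = \frac{y - 3}{2}$.
Then $f(x) = 2\cdot\frac{y - 3}{2} + 3 = y$.

Since $f$ is both injective and surjective, $f$ is bijective.
```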

I didn't completely understand why Theorem 9.4 is false for infinite sets, but I assume we don't need to fully grasp that yet, since we haven't discussed infinite sets with these properties.

Interesting: It's interesting thinking about these properties with finite sets. I can always picture a graph on the real Cartesian plane to fulfill the properties, but when we start talking about equivalence sets, it's more difficult to picture/conceptualize, but easier to prove.

Tuesday, October 21, 2014

9.1-9.2, Due October 22

Difficult: So let me make sure I have this correct: the difference between codomain and range is that the codomain is all the values the outputs are allowed to take, while the range is only the ones that actually get used? What does the symbol that looks like an o with a vertical slash through it mean? I just think it looks nice, but does it actually mean something specific? Also, if B = {1,2}, would it be bad to represent the set of all functions from A to B by 2^A? (Since B is supposed to be {0,1}, according to the last line of Chapter 9.2.)
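
Here's the tiny example I made up to keep codomain and range straight (my own example, not the book's):

```latex
% My own tiny example of codomain vs. range.
Let $A = \{1,2,3\}$, $B = \{a,b,c\}$, and define $f:A\to B$ by
$f(1) = a$, $f(2) = a$, $f(3) = b$.
The codomain is all of $B = \{a,b,c\}$ (everything $f$ is \emph{allowed} to hit),
while the range is $\operatorname{ran} f = \{a,b\}$ (what $f$ \emph{actually} hits).
```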

Interesting: There is definitely a lot of interesting notation and vocabulary to learn, such as image and mapping. I also thought the evolving definition of the word "function" over time was interesting, though the really old versions are quite abstract and difficult to understand.

Saturday, October 18, 2014

8.6, Due October 20

Difficult: So this all sounds fine and dandy; I'm just not so sure about when sums or products are not well-defined. I mean, I guess the example at the end of the chapter just completely changed the definition of multiplication (redefined the function) and then called it not well-defined... Sounds like cheating to me. Haha

But it would be great to go over proofs that something is well-defined. From what I can see, it looks like we've just added another layer: well-defined → equivalence classes → relations → modulos → divisibility, and then all the way back out!
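
Here's my attempt at the skeleton of a well-defined proof, for addition of congruence classes mod n (my own write-up, hopefully close to what we'll do in class):

```latex
% My attempt: addition of congruence classes mod n is well-defined.
Suppose $[a] = [a']$ and $[b] = [b']$ in $\mathbb{Z}_{n}$, so that
$n \mid (a - a')$ and $n \mid (b - b')$. Then
\[
  (a + b) - (a' + b') = (a - a') + (b - b'),
\]
a sum of two multiples of $n$, so $n \mid \bigl((a + b) - (a' + b')\bigr)$ and
$[a + b] = [a' + b']$. The answer doesn't depend on which representatives we
picked, which is what ``well-defined'' means.
```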

Interesting: Man, we use modular arithmetic all the time! Clocks are a great example: minutes and seconds work mod 60, and hours work mod 12 (or mod 24). Days of the week work mod 7, but I guess days of the month get messed up because it's mod 28, 30, or 31 depending on the month. Still cool nonetheless!

Wednesday, October 15, 2014

8.5, Due October 17

Difficult: The most difficult thing is figuring out what to do with the symmetric property. Reflexive and transitive seem to go pretty similarly, but proving symmetry seems a little more difficult. It's helpful when you show us how to write the skeleton and explain why you write what you do, so examples like that will be awesome.
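
Here's how I think a symmetry proof would go for congruence mod n (my own skeleton, so the phrasing might be off):

```latex
% My skeleton for a symmetry proof, using a \equiv b (mod n).
Assume $a \equiv b \pmod{n}$. Then $n \mid (a - b)$, so $a - b = nk$ for some
integer $k$. Hence $b - a = n(-k)$, so $n \mid (b - a)$, which means
$b \equiv a \pmod{n}$. Therefore the relation is symmetric.
```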

Interesting: I thought it was pretty tricky when they showed that not every relation built from a congruence mod n has n equivalence classes. For example, the relation defined by a^2 ≡ b^2 (mod 3) has only two distinct equivalence classes. It doesn't make sense intuitively at first, but it's pretty interesting, and it does make sense once you prove it.
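
To see why only two classes show up, I just listed the squares mod 3 (a quick check I did myself, not the book's argument):

```latex
% Quick check: squares mod 3 only take two values.
\[
  0^{2} \equiv 0,\qquad 1^{2} \equiv 1,\qquad 2^{2} = 4 \equiv 1 \pmod{3},
\]
so $a^{2} \bmod 3$ is $0$ when $3 \mid a$ and $1$ otherwise. That's why the
relation $a \,R\, b \iff a^{2} \equiv b^{2} \pmod{3}$ has only two distinct
equivalence classes: the multiples of $3$, and everything else.
```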

Monday, October 13, 2014

8.3-8.4, Due October 15

Difficult: Hold on here, so remind me: can you have the same element twice in a set? I'm assuming not, so {3,2,1+1} is bad and should be written as {3,2}? Otherwise, Theorem 8.3 would not make sense, since there would be multiple copies of the same set in the partition P. Or maybe we just don't count them twice? Could you maybe remind us about that?

Again, like a lot of things in this class, there's a big handful of concepts to master, and that will take practice. So examples rock.

Interesting: It is way cool that we can call things equivalence relations if they are reflexive, symmetric, AND transitive. I went back through the examples we did in class and realized that congruence mod n is reflexive, symmetric, and transitive, so it's an equivalence relation! Which means... we're probably going to have to start proving more of those soon.

Sunday, October 12, 2014

8.1-8.2, Due October 13

Difficult: The most difficult thing was definitely the transitive property. I was familiar with all three of these properties, just not in the context of relations. It was okay when I had an explicit set of ordered pairs to check for transitivity. However, when it became more abstract, such as the relation |a-b|<1, it was much harder to visualize. The most helpful thing would probably be working through examples to get these concepts down. I feel like, just as we did with sets, we're going to learn these concepts and then have to prove them to the maximus in upcoming lessons.

Interesting: I'm a really visual guy, so when I had to check whether a relation was transitive for specific, explicit elements, I would just picture the ordered pairs bumping into each other. (They weren't allowed to rotate, but they could slide around and bump into other pairs.) If the right side of one pair bumped into the left side of another and the values matched, the two middle numbers would disappear and leave me with a new ordered pair. That new pair had to already be in the relation, or else it wasn't transitive.

I know that's kind of goofy, but it helped.
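
Actually, that "bumping" picture is basically a little algorithm, so I tried writing it out as a quick checker (just my own sketch in Python, with made-up example relations):

```python
# My "bumping" picture as code: a relation R (a set of ordered pairs) is
# transitive when every match of (a, b) with (b, c) produces a pair (a, c)
# that is already in R.  (The example relations below are made up by me.)
def is_transitive(R):
    return all((a, d) in R
               for (a, b) in R
               for (c, d) in R
               if b == c)

print(is_transitive({(1, 2), (2, 3)}))          # False: (1, 3) is missing
print(is_transitive({(1, 2), (2, 3), (1, 3)}))  # True
```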

Thursday, October 9, 2014

6.4, Due October 10

Difficult: This is certainly the most difficult lesson I have read so far. I just don't get why we are using i... that inductive step is throwing me off! So is all we're doing adding more base cases so that we get to assume the statement for every value up to k, not just for k itself? I thought we already did that...

When you throw four variables in there (a, k, i, and n), these proofs get so much harder to follow. I can't even understand this concept well enough to ask a sensible question about it: it's all just a blur... I watched a YouTube video about strong induction, but it still didn't make much sense. Hopefully tomorrow will be filled with lots of questions and examples.

Well, I guess one question, why are recursive sequences always shown in sets? Is that just... how they are?
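
Here's the strong-induction example I made up to see why we get to assume things about every i up to k, using a recursive sequence of my own (not one from the book):

```latex
% My made-up sequence: a_1 = 1, a_2 = 3, a_n = 2a_{n-1} - a_{n-2} for n >= 3.
% Claim: a_n = 2n - 1 for all n >= 1.
\textbf{Base cases:} $a_{1} = 1 = 2(1) - 1$ and $a_{2} = 3 = 2(2) - 1$.

\textbf{Inductive step:} let $k \ge 2$ and assume $a_{i} = 2i - 1$ for all
$1 \le i \le k$. Then
\[
  a_{k+1} = 2a_{k} - a_{k-1} = 2(2k - 1) - (2k - 3) = 2k + 1 = 2(k + 1) - 1.
\]
The step needed the assumption for both $i = k$ and $i = k - 1$, which is
exactly why the strong version lets us assume $P(i)$ for every $i \le k$.

\textbf{Conclusion:} by the Strong Principle of Mathematical Induction,
$a_{n} = 2n - 1$ for all $n \ge 1$.
```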

Interesting: I'm sure this will be much more interesting when I can understand what this is saying.  Recursive sequences, however, are very interesting. Like the Fibonacci sequence and how it is EVERYWHERE in nature. That's pretty cool stuff!

Math Careers: Financial Engineering

I thought it was really interesting how highly they thought of a math degree and how much math intersects with other fields. They explained the role of mathematics in their job really well; they talked about how derivatives, multivariate statistics, and differentials played into their buying and selling. They also used extrapolation to predict how prices will rise or fall over a set amount of time. Trends over time are much more accurate when the math behind the algorithms is more sound. Being a financial engineer requires a lot of innovation, critical thinking, and creativity.

It was difficult to understand all the financial terms they used when discussing markets, pricing, and modeling, but it was still interesting nonetheless.

Monday, October 6, 2014

6.2, Due October 8

Difficult: One concept I wasn't sure about is when you would want to use induction as opposed to other proof techniques. We learned some basic indicators for when to use direct, contrapositive, and contradiction proofs. Are there similar cues for proofs requiring induction? Is it any time you would otherwise have to do an infinite number of proofs?

Also, the example proving that the cardinality of a power set is 2^n was tricky to follow, so working one out in class would help a lot.

Oh, and do we need to write "By the Principle of Mathematical Induction..." at the end of each of our proofs?
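
While I was at it, here's the basic induction skeleton written out on the classic sum formula (my own practice write-up, so the phrasing may not be exactly what we're expected to write):

```latex
% My practice induction proof of 1 + 2 + ... + n = n(n+1)/2.
\textbf{Base case:} for $n = 1$, both sides equal $1$.

\textbf{Inductive step:} assume $1 + 2 + \cdots + k = \frac{k(k+1)}{2}$. Then
\[
  1 + 2 + \cdots + k + (k + 1) = \frac{k(k+1)}{2} + (k + 1)
  = \frac{(k+1)(k+2)}{2}.
\]

\textbf{Conclusion:} by the Principle of Mathematical Induction, the formula
holds for all $n \in \mathbb{N}$.
```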

Interesting: I am really glad we are learning induction, as it is one of the main proof techniques we use in our Putnam class. I find it interesting that we are able to do so many different types of proofs using induction. From even/odds, to divisibility, to sets, they all work out!

Thursday, October 2, 2014

Exam Review, Due October 3

  • Which topics and theorems do you think are the most important out of those we have studied?
I personally think the most important topics are the three kinds of proofs and the axioms. As long as you are able to remember those, you should be able to prove the theorems even if you don't have them explicitly memorized.
  • What kinds of questions do you expect to see on the exam?
I expect to see a lot of proofs using the principles of sets we learned in chapter one. This is the best way to test comprehensive knowledge, because you have to know both the information on sets and the strategies to conduct a proper proof.
  • What do you need to work on understanding better before the exam? Come up with a mathematical question you would like to see answered or a problem you would like to see worked out
The biggest thing I need to work on (after taking the practice exam) is not messing up on the little things and making sure I have all the definitions down soundly. I would like to see a couple of proofs involving sets (showing two sets are equal, or that one is a subset of another), and some problems with congruence and modulo stuff. I took a stab at a set-equality proof below.
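
Here's my practice run at a set-equality proof using double inclusion (my own choice of example, A ∩ (A ∪ B) = A):

```latex
% My practice set-equality proof by double inclusion: A \cap (A \cup B) = A.
($\subseteq$) Let $x \in A \cap (A \cup B)$. Then in particular $x \in A$.

($\supseteq$) Let $x \in A$. Then $x \in A \cup B$, so $x \in A \cap (A \cup B)$.

Since each set is a subset of the other, $A \cap (A \cup B) = A$.
```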