Wikipedia:Reference desk/Archives/Mathematics/2013 November 5

= November 5 =

Book on "Logics"
Lately I've been studying Descriptive Complexity; in the context of finite models, it discusses various extensions of FO (like transitive closure, various fixed-point operators, etc.). I find the various methods of extending a given logic, and what a "logic" is exactly, quite interesting (maybe more than DC itself); so I was wondering, are there any books that cover this (and aren't impossible to read; hard is fine, though)? Specifically, what I'd really like is something that takes the approach of books on algebra: start with a general idea of the subject, then investigate specific structures satisfying specific constraints (groups, rings, etc.), with the focus on the various structures and what relates them. Essentially, I'd like something more focused on "logics" as an object of study, with less of a focus on logics that we can actually use (if that makes sense...). Thanks for any help:-)Phoenixia1177 (talk) 06:55, 5 November 2013 (UTC)

Since the above might not make the most sense, here are the types of questions I'd be interested in (these specifically need not be covered, just things like them):
 * Given a logic, say FO, how do you characterize sentences that need at least n different variable symbols? Or, rather, what can you say if you only have n variable symbols available?
 * Are there quantifiers besides existential and universal? What exactly is quantification?
 * Are logics fairly hierarchical, or is there branching, in terms of power? Is it a fairly linear sequence starting at FO and working up to SO, etc., or are there extensions that go in completely different directions?
 * Are there things that could be called logic that do not fall into the nth-order family?
 * What about logics that aren't bivalent? Supposing that we can consider logics that take truth values in some lattice(ish) space, how does the lattice structure impact logical strength? What is quantification here?
 * Can you do something analogous to the relation between Euclidean and Absolute Geometry with the relation between Classical and Intuitionistic logic? If yes, are there non-classical extensions of classical logic that are incompatible with it (as the various negations of the parallel postulate are)?

Obviously, all of this will depend on just what "logic" means, and that question is the central interest, I suppose.Phoenixia1177 (talk) 07:33, 5 November 2013 (UTC)
 * You don't need both quantifiers in classical logic: ∀x P(x) ≡ ¬∃x ¬P(x). I guess you could manufacture others, such as ¬∃ (non-existence) and ¬∀ (non-universality), but I am not aware of anyone who does. As for non-True/False logics, see Fuzzy Logic. 51kwad (talk) 15:13, 7 November 2013 (UTC)
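The duality mentioned above, ∀x P(x) ≡ ¬∃x ¬P(x), can be checked mechanically over any finite domain; here is a minimal brute-force sketch (the 3-element domain and the use of Python's `all`/`any` as the two quantifiers are my own illustrative choices, not anything from the thread):

```python
# Check the classical dualities  ∀x P(x) ≡ ¬∃x ¬P(x)  and  ∃x P(x) ≡ ¬∀x ¬P(x)
# by enumerating every unary predicate on a small finite domain.
from itertools import product

domain = [0, 1, 2]

# Each length-3 tuple of booleans is the truth table of one predicate P.
for table in product([False, True], repeat=len(domain)):
    P = dict(zip(domain, table)).__getitem__
    # ∀x P(x)  vs  ¬∃x ¬P(x)
    assert all(P(x) for x in domain) == (not any(not P(x) for x in domain))
    # ∃x P(x)  vs  ¬∀x ¬P(x)
    assert any(P(x) for x in domain) == (not all(not P(x) for x in domain))

print("duality holds for all", 2 ** len(domain), "predicates on a 3-element domain")
```

Of course this only verifies the classical, bivalent case on a finite domain — which is precisely why it stops being automatic in the non-bivalent and intuitionistic settings asked about above.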


 * I think you might want a textbook on metalogic, rather than "logic" per se. That topic will cover more of the analysis of how different logical systems relate to each other, rather than teach you how to perform tasks and prove theorems using specific logic schemes. You also get the "fun" of learning how to properly read/write metasyntax, and keeping metasyntactic variables separate from their mundane counterparts. If that seems right to you, I can probably get a specific metalogic textbook recommendation from my colleagues. SemanticMantis (talk) 22:11, 7 November 2013 (UTC)
 * E.g. "Metalogic: an Introduction to the Metatheory of Standard First Order Logic" by Hunter (here on Amazon) -- it is restricted to first-order logic so it won't specifically address your questions, but it's cheap/readily available, and would let you figure out if that's the kind of perspective/analysis you want. Also, it's usually good to look over an introductory book before you dive off the deep end ;) SemanticMantis (talk) 22:18, 7 November 2013 (UTC)
 * Also, if you're not already familiar, check out books on modal logic, which uses some extra logical connectives, and books on the topic usually contain some introductory metalogic. SemanticMantis (talk) 22:23, 7 November 2013 (UTC)
 * Thank you:-) That's exactly the type of thing I'm looking for:-) If you come across any other recommendations, I'd be extremely appreciative; I just put an order out for the book you recommended. Thank you again:-)Phoenixia1177 (talk) 04:35, 8 November 2013 (UTC)

Beautiful Alternate Expression for the Beta function

 * $$\int_0^\infty\frac{x^a}{(1+x)^b}\,dx=\Beta(a,b-a)$$

Does anyone know how to prove or explain this identity? — 79.118.191.86 (talk) 22:51, 5 November 2013 (UTC)


 * Beta function says (with a citation) that it can be written

$$\Beta(x,y) = \int_0^\infty\dfrac{t^{x-1}}{(1+t)^{x+y}}\,dt, \qquad \mathrm{Re}(x)>0,\ \mathrm{Re}(y)>0$$


 * which is almost equivalent to your expression with x= your a, y = your b-a, and t = your x, except that this has a "-1" in the numerator that you don't have. Maybe this expression is proven in the cited reference, which is Davis (1972) 6.2.1 p.258. Duoduoduo (talk) 23:57, 5 November 2013 (UTC)
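The "-1" bookkeeping aside, the quoted formula itself is easy to sanity-check numerically. A rough sketch (the midpoint rule, the change of variables t = u/(1−u) mapping the half-line onto (0, 1), and the test values x = 2.5, y = 1.5 are all my own choices, not anything from the thread):

```python
import math

def beta_integral(x, y, n=200_000):
    """Numerically evaluate B(x, y) = ∫₀^∞ t^(x-1) / (1+t)^(x+y) dt.

    The improper integral is mapped onto (0, 1) via t = u/(1-u),
    dt = du/(1-u)^2, then approximated with the midpoint rule.
    """
    total = 0.0
    h = 1.0 / n
    for i in range(n):
        u = (i + 0.5) * h              # midpoint of the i-th subinterval
        t = u / (1.0 - u)
        total += t**(x - 1) / (1.0 + t)**(x + y) / (1.0 - u)**2 * h
    return total

def beta_gamma(x, y):
    """B(x, y) via the classical identity B(x, y) = Γ(x)Γ(y) / Γ(x+y)."""
    return math.gamma(x) * math.gamma(y) / math.gamma(x + y)

# For x = 2.5, y = 1.5 the exact value is π/16 ≈ 0.19635; both agree.
print(beta_integral(2.5, 1.5))
print(beta_gamma(2.5, 1.5))
```

The agreement of the quadrature with the Gamma-function form is only numerical evidence, of course, but it is a quick way to convince oneself that the integral representation (with the `t^{x-1}` numerator) is the right one.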
 * If I'd had access to that book, I wouldn't have asked the question in the first place... :-) — 79.113.194.164 (talk) 04:41, 6 November 2013 (UTC)
 * Never mind, one just substitutes $$t=\frac1{1+x},$$ and the rest follows. — 79.113.194.164 (talk) 05:08, 6 November 2013 (UTC)
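For the record, here is that substitution written out against the t-form quoted above — a sketch only: set u = 1/(1+t), so t = (1−u)/u and dt = −du/u², and the orientation flip of the interval cancels the minus sign:

```latex
\begin{align*}
\int_0^\infty \frac{t^{x-1}}{(1+t)^{x+y}}\,dt
  &= \int_0^1 \left(\frac{1-u}{u}\right)^{x-1} u^{\,x+y}\,\frac{du}{u^{2}}\\
  &= \int_0^1 u^{\,y-1}\,(1-u)^{x-1}\,du
   \;=\; \Beta(y,x) \;=\; \Beta(x,y),
\end{align*}
```

where the last two steps are Euler's integral for the Beta function and its symmetry in its two arguments.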