User:Zhuxiao

Welcome to ZhuXiao's Research Wiki! Listed here are books, notes, and papers on the subjects/topics that Zhuxiao has been highly interested in. These can be classified into two main categories, and I sincerely invite you to study these subjects jointly with me. (We may set up an online study group for any subject of interest.) Please direct any comments to zhuxiao_ee(at)yahoo(dot)com(dot)cn
 * 1) Fundamental Studies in Analog IC Design, and
 * 2) Engineering Applications in Electronic Design Automation.

Data Mining

 * Conferences and Their Deadlines
 * BOOKS
 * 1) Jiawei Han and Micheline Kamber (2005). Data Mining: Concepts and Techniques (2nd ed.). Morgan Kaufmann Publishers, ISBN:1558604898.
 * 2) Ian H. Witten and Eibe Frank (2005). Data Mining: Practical Machine Learning Tools and Techniques (2nd ed.). Morgan Kaufmann, ISBN:0120884070.
 * 3) Pang-Ning Tan, Michael Steinbach and Vipin Kumar (2005). Introduction to Data Mining. Addison Wesley, ISBN:0321321367.
 * 4) Mehmed Kantardzic (2003). Data Mining: Concepts, Models, Methods, and Algorithms. John Wiley & Sons, ISBN:0471228524.
 * NOTES
 * 1) Jeffrey D. Ullman (2005). Lecture Notes (I), Lecture Notes (II). Course CS345: Data Mining, Stanford University.
 * 2) Inderjit S. Dhillon (2006). Lecture Notes. Course CS378: Introduction to Data Mining, Univ of Texas at Austin.
 * 3) Chris Clifton (2005). Lecture Notes. Course CS590D: Data Mining, Purdue University.
 * 4) Christoph F. Eick (2005). Lecture Notes. Course COSC 6397: Data Mining, University of Houston.
 * 5) Andrew Moore (200X). Data Mining Tutorials. Computer Science Dept., Carnegie Mellon University.
 * PAPERS on Sequential Data Mining
 * PAPERS on Association Rule Mining

Statistical Learning Theory

 * BOOKS
 * 1) (*) Vapnik, V.N. (1999). The Nature of Statistical Learning Theory (2nd ed.). Springer-Verlag.
 * 2) Vapnik, V.N. (1998). Statistical Learning Theory. Wiley-Interscience. (E)
 * 3) Hastie, T., Tibshirani, R., and Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
 * 4) Cristianini, N. and Shawe-Taylor, J. (2000). An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press. (X)
 * NOTES
 * 1) Poggio, T., Rakhlin, S., Caponnetto, A. and Rifkin, R. (2006). Lecture Notes. Course 9.520: Statistical Learning Theory and Applications. M.I.T.
 * 2) Jordan, M. (2004). Lecture Notes. Course CS281B/Stat241B: Statistical Learning Theory, UC Berkeley.
 * 3) Ben-David, S. (2003) Lecture Notes. Course ECE695: Statistical Learning Theory, Cornell University.
 * PAPERS
 * 1) Vapnik, V.N. (1999). An Overview of Statistical Learning Theory. IEEE Trans. Neural Networks, 988-999. (04/24/2006) Comments: Vapnik presents a very high-level overview of statistical learning theory. The good part is that readers can use this paper as a guideline for reading his two books listed above. The Russian theorist does not tell you much in detail in this paper; for that, turn to his books.
 * 2) Bousquet, O., Boucheron, S. and Lugosi, G. (2004). Introduction to Statistical Learning Theory. Advanced Lectures on Machine Learning, Lecture Notes in Artificial Intelligence 3176, 169-207. (04/22/2006) Comments: The authors provide a thorough mathematical foundation for statistical learning theory, assuming that readers already have an idea of what statistical learning is. Although some sections are not self-contained, this paper is still a good reference for developing a deeper view of statistical learning theory.
 * 3) Vert, J.-P., Tsuda, K. and Schölkopf, B. (2004). A Primer on Kernel Methods. MIT Press, Cambridge, MA. (10/19/2006) Comments: This is the best introductory paper I've ever read on kernel methods and support vector machines. Mathematical terms are clearly defined and explained so that readers can understand them very quickly.

Computational Learning Theory (a.k.a. Machine Learning Theory)
The study of computational learning theory does not yet have a bible, since this research field only germinated in the early 90s. Kearns's book and Schapire's lecture notes are good starting points.
 * BOOKS
 * 1) (*) Kearns, M.J. and Vazirani, U.V. (1994). An Introduction to Computational Learning Theory. The MIT Press. (X)
 * 2) Kearns, M.J. (1990). The Computational Complexity of Machine Learning. The MIT Press. (E)
 * NOTES
 * 1) Schapire, Rob. (2005). Lecture Notes. Course CS511: Foundations of Machine Learning. Princeton University. (04/29/2006) Comments: Prof. Schapire has taught this course for a couple of years. The materials cover the most important fundamental concepts of machine learning well, from both the computational and statistical perspectives. This is one of my favorite course websites.
 * 2) Rivest, R. (1994). Lecture Notes. Course 6.858/18.428: Machine Learning. M.I.T. (04/29/2006) Comments: This course website is recommended by my advisor, Prof. Li-C. Wang. However, Prof. Rivest does not seem to have taught this course since 1994, so I am not sure whether these materials are up-to-date.
 * 3) Mitchell, T. and Moore, A. (2005). Lecture Notes. Course 10-701/15-781: Machine Learning. Carnegie Mellon University.
 * PAPERS

Theory of Computation (a.k.a. Complexity Theory, Automata Theory)

 * BOOKS
 * 1) (*) Wegener, I. (2005). Complexity Theory: Exploring the Limits of Efficient Algorithms. Springer.
 * 2) (*) Sipser, M. (1996). Introduction to the Theory of Computation. Course Technology.
 * 3) Linz, P. (2000). An Introduction to Formal Languages and Automata. Jones & Bartlett Publishers.
 * 4) Hopcroft, J.E., Motwani, R. and Ullman, J.D. (2000). Introduction to Automata Theory, Languages, and Computation (2nd ed.). Addison-Wesley.
 * 5) Kohavi, Z. (1978). Switching and Finite Automata Theory. TATA McGraw-Hill.
 * 6) Straubing, H. (1994). Finite Automata, Formal Logic, and Circuit Complexity. Birkhäuser Boston.
 * 7) Vollmer, H. (1999). Introduction to Circuit Complexity: a Uniform Approach. Springer-Verlag.
 * 8) Papadimitriou, C.H. (1993). Computational Complexity. Addison-Wesley. (X)
 * 9) Papadimitriou, C.H. and Steiglitz, K. (1998). Combinatorial Optimization: Algorithms and Complexity. Dover.
 * 10) Cormen, T.H., Leiserson, C.E., Rivest, R.L. and Stein, C. (2001). Introduction to Algorithms (2nd ed.). The MIT Press.
 * 11) Aho, A.V., Hopcroft, J.E. and Ullman, J.D. (1974). The Design and Analysis of Computer Algorithms. Addison-Wesley.
 * NOTES
 * PAPERS

Information Theory (a.k.a. Communication Theory, Coding Theory)

 * BOOKS
 * 1) (*) MacKay, D. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press.
 * 2) Cover, T.M. and Thomas, J.A. (2006). Elements of Information Theory (2nd ed.). Wiley-Interscience. (X)
 * NOTES
 * PAPERS

Mathematical Logic

 * BOOKS
 * 1) Ebbinghaus, H.-D., Flum, J. and Thomas, W. (1984). Mathematical Logic (Undergraduate Texts in Mathematics). Springer. (E)
 * 2) Ben-Ari, M. (2003). Mathematical Logic for Computer Science. Springer. (E)
 * NOTES
 * PAPERS

Probability and Statistics

 * BOOKS
 * 1) Papoulis, A. and Pillai, S.U. (2002). Probability, Random Variables and Stochastic Processes. McGraw-Hill.
 * 2) Hsu, H.P. (1997). Theory and Problems of Probability, Random Variables, and Random Processes. McGraw-Hill. (E)
 * 3) Jaynes, E.T. (1995). Probability Theory : The Logic of Science. Cambridge University Press. (E)
 * 4) Ross, S.M. (1970). Applied Probability Models with Optimization Applications. Dover.
 * 5) Tabachnick, B.G. and Fidell, L.S. (2001). Using Multivariate Statistics (4th ed.). Allyn & Bacon.
 * NOTES
 * PAPERS

Engineering Applications in Electronic Design Automation continues...

note:
 * 1) The bibliography is arranged in Springer-Verlag's style, not IEEE/ACM style.
 * 2) (X) indicates that Charles does not have this material at hand.
 * 3) (*) indicates the materials I am studying or am going to study in the near future.