Teaching

Computational Linguistics and Information Processing

Fall 2010

Computational Linguistics I (Daume)

Fundamental methods in natural language processing. Topics include: finite-state methods, context-free and extended context-free models of syntax; parsing and semantic interpretation; n-gram and Hidden Markov models, part-of-speech tagging; natural language applications such as machine translation, automatic summarization, and question answering.
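
To give a flavor of one listed topic (an illustrative sketch, not course material): a minimal bigram language model with add-one smoothing in Python, over a toy corpus invented for the example.

    from collections import Counter
    from math import log

    # Toy corpus, invented for illustration; sentences get start/end markers.
    corpus = [
        "the cat sat on the mat".split(),
        "the dog sat on the log".split(),
    ]

    unigrams = Counter()
    bigrams = Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))

    vocab_size = len(unigrams)

    def bigram_logprob(prev, word):
        # Add-one (Laplace) smoothed log P(word | prev).
        return log((bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size))

    def sentence_logprob(sent):
        tokens = ["<s>"] + sent + ["</s>"]
        return sum(bigram_logprob(p, w) for p, w in zip(tokens, tokens[1:]))

    # Higher (less negative) log probability means the model finds the sentence more plausible.
    print(sentence_logprob("the cat sat on the log".split()))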

Seminar in Computational Linguistics (Resnik)

This advanced seminar will focus on computational modeling of language, including cognitive/linguistic aspects as well as practical language technology. Bayesian and information theoretic approaches will figure prominently. The seminar will combine reviewing fundamental material, taking a reading-group approach to key advanced papers, and (hopefully) bringing in guest speakers. Cross-disciplinary participation is strongly encouraged. (Course Web page)

Machine Learning (Getoor)

Reviews and analyzes both traditional symbol-processing methods and genetic algorithms as approaches to machine learning. (Neural network learning methods are primarily covered in CMSC 727.) Topics include induction of decision trees and rules, version spaces, candidate elimination algorithm, exemplar-based learning, genetic algorithms, evolution under artificial selection of problem-solving algorithms, system assessment, comparative studies, and related topics.
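
As a small illustration of the genetic-algorithm topic (a sketch with an invented toy problem and parameters, not course material): evolving a bitstring toward all ones under selection, crossover, and mutation.

    import random

    random.seed(0)

    # Toy "OneMax" problem: fitness is the number of ones in the bitstring.
    GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 40, 0.05

    def fitness(genome):
        return sum(genome)

    def mutate(genome):
        # Flip each bit independently with probability MUT_RATE.
        return [1 - g if random.random() < MUT_RATE else g for g in genome]

    def crossover(a, b):
        # Single-point crossover of two parent genomes.
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]

    def select(population):
        # Tournament selection: keep the fitter of two random individuals.
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for generation in range(GENERATIONS):
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]

    best = max(population, key=fitness)
    print("best fitness:", fitness(best), "out of", GENOME_LEN)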

Spring 2010

Computational Linguistics II (Resnik)

This is the second semester in our graduate sequence in computational linguistics. Students are assumed to have taken the first semester (Ling723/CMSC723) or equivalent, and this class will provide foundations for advanced seminars in computational linguistics. Students are expected to know how to program, and will exercise this ability periodically in homework assignments and/or projects. The topics we'll cover are intended to get students up to speed on the background needed to understand and perform cutting-edge research in natural language processing, which requires a strong grounding in statistical NLP models and methods. Some of the topics are in the same areas as in Computational Linguistics I, but we will go deeper. As always, the syllabus is subject to revision; however, it will follow Manning and Schuetze's textbook relatively closely, at least in the early parts of the course. (Recent course Web page: http://www.umiacs.umd.edu/~resnik/ling773_sp2010/)

Cloud Computing (Boyd-Graber)