
Teaching

Computational Linguistics and Information Processing


Fall 2010

Computational Linguistics I (Daume)

Fundamental methods in natural language processing. Topics include: finite-state methods; context-free and extended context-free models of syntax; parsing and semantic interpretation; n-gram and Hidden Markov models and part-of-speech tagging; and natural language applications such as machine translation, automatic summarization, and question answering.

Seminar in Computational Linguistics (Resnik)

This advanced seminar will focus on computational modeling of language, including cognitive/linguistic aspects as well as practical language technology. Bayesian and information-theoretic approaches will figure prominently. The seminar will combine reviewing fundamental material, taking a reading-group approach to key advanced papers, and (hopefully) bringing in guest speakers. Cross-disciplinary participation is strongly encouraged.

Machine Learning (Getoor)

Reviews and analyzes both traditional symbol-processing methods and genetic algorithms as approaches to machine learning. (Neural network learning methods are primarily covered in CMSC 727.) Topics include induction of decision trees and rules, version spaces, the candidate elimination algorithm, exemplar-based learning, genetic algorithms, evolution of problem-solving algorithms under artificial selection, system assessment, comparative studies, and related topics.

Spring 2010

Computational Linguistics II (Resnik)

Cloud Computing (Boyd-Graber)