Course, academic year 2023/2024
Statistical Methods in Natural Language Processing I - NPFX067
Title: Statistické metody zpracování přirozených jazyků I
Guaranteed by: Student Affairs Department (32-STUD)
Faculty: Faculty of Mathematics and Physics
Actual: from 2021
Semester: winter
E-Credits: 6
Hours per week, examination: winter s.:2/2, C+Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: taught
Language: Czech
Teaching methods: full-time
Is provided by: NPFL067
Guarantor: prof. RNDr. Jan Hajič, Dr.
doc. RNDr. Pavel Pecina, Ph.D.
Class: DS, Mathematical Linguistics
Informatics Mgr. - Mathematical Linguistics
Classification: Informatics > Computer and Formal Linguistics
Pre-requisite : {NXXX011, NXXX012, NXXX013, NXXX038, NXXX039, NXXX040, NXXX067, NXXX069, NXXX070, NXXX071}
Incompatibility : NPFL067
Interchangeability : NPFL067
Annotation -
Introduction to formal linguistics and the fundamentals of statistical natural language processing, including basics of Information Theory, Language Modeling, and Markov Models. Continues as Statistical Methods in Natural Language Processing II.
Last update: T_UFAL (20.05.2004)
Course completion requirements -

Turning in both homework assignments (66.7%), written exam (33.3%). The "zápočet" (course credit) is not a prerequisite for taking the exam. To earn the "zápočet", the homework grade total must be at least 80 points (out of 200). Each homework may be turned in at most three times, at the latest on the date announced on the course webpage. Every late day subtracts 5 points; turning in a homework more than 10 days after the deadline carries a flat penalty of 50 points.

Last update: Hajič Jan, prof. RNDr., Dr. (28.09.2020)
Literature -

Manning, C. D. and H. Schütze: Foundations of Statistical Natural Language Processing. The MIT Press. 1999. ISBN 0-262-13360-1.

Jurafsky, D. and J. Martin: Speech and Language Processing. Prentice Hall. Any edition (1st: 2000).

Cover, T. M. and J. A. Thomas: Elements of Information Theory. Wiley. 1991. ISBN 0-471-06259-6.

Last update: Hajič Jan, prof. RNDr., Dr. (28.09.2020)
Requirements to the exam -

There is one written exam with 4-5 questions, each with sub-questions. The scope of the exam corresponds to the syllabus and to the material presented in the lectures and exercises. The net time allowed for finishing the exam is 60 minutes; it is an open-book exam, and calculators are allowed. Grading is on a scale of 0 to 100 points, and the points carry a weight of 33.3% in the final grade. The exam may be administered online.

Last update: Hajič Jan, prof. RNDr., Dr. (28.09.2020)
Syllabus -

Introduction. Course Overview: Intro to NLP. Main Issues.

The Very Basics on Probability Theory. Elements of Information Theory I. Elements of Information Theory II.
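The central quantity in the information-theory lectures is Shannon entropy. A minimal sketch in Python (the example distributions below are illustrative, not course data):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit of uncertainty;
# a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

Entropy is maximized by the uniform distribution and drops toward zero as the distribution becomes more predictable.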

Language Modeling in General and the Noisy Channel Model. Smoothing and the EM algorithm.
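Smoothing reassigns probability mass from seen to unseen events so that the language model never outputs zero. A minimal sketch of add-lambda (Lidstone) smoothing for unigrams, with made-up counts and vocabulary size chosen purely for illustration:

```python
from collections import Counter

def add_lambda_prob(counts, vocab_size, lam=0.1):
    """Smoothed unigram estimate P(w) = (c(w) + lam) / (N + lam * V)."""
    n = sum(counts.values())
    denom = n + lam * vocab_size
    return lambda w: (counts.get(w, 0) + lam) / denom

counts = Counter("the cat sat on the mat".split())
p = add_lambda_prob(counts, vocab_size=10, lam=0.1)
print(p("the"))  # seen word: (2 + 0.1) / 7
print(p("dog"))  # unseen word still gets nonzero mass: 0.1 / 7
```

With lam = 1 this reduces to add-one (Laplace) smoothing; the EM algorithm covered in the same lecture is typically used instead to tune the weights of interpolated n-gram models on held-out data.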

Word Classes and Lexicography. Mutual Information (the "pointwise" version). The t-score. The Chi-square test. Word Classes for NLP tasks. Parameter Estimation. The Partitioning Algorithm. Complexity Issues of Word Classes. Programming Tricks & Tips.
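Pointwise mutual information compares the joint probability of a word pair with what independence would predict, which is why it is a standard collocation score. A sketch with hypothetical probabilities:

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information I(x; y) = log2( p(x,y) / (p(x) * p(y)) )."""
    return math.log2(p_xy / (p_x * p_y))

# If the pair occurs 10x more often than chance, PMI is log2(10) ~ 3.32 bits.
print(pmi(0.01, 0.02, 0.05))
```

PMI is positive when the words co-occur more than chance, zero under independence, and negative when they repel each other; the t-score and chi-square test covered in the same lecture additionally account for how reliable the counts are.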

Markov models, Hidden Markov Models (HMMs). The Trellis & the Viterbi Algorithms. Estimating the Parameters of HMMs. The Forward-Backward Algorithm. Implementation Issues.
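The Viterbi algorithm finds the most probable hidden-state sequence by dynamic programming over the trellis. A minimal sketch; the two-tag HMM at the bottom is a made-up toy, not course material:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable state sequence for obs under an HMM (dict-based params)."""
    # delta[t][s]: probability of the best path ending in state s at time t
    delta = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        delta.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda r: delta[t - 1][r] * trans_p[r][s])
            delta[t][s] = delta[t - 1][prev] * trans_p[prev][s] * emit_p[s][obs[t]]
            back[t][s] = prev
    # Backtrack from the best final state.
    last = max(states, key=lambda s: delta[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Toy POS-tagging HMM with two states (noun N, verb V).
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
emit_p = {"N": {"fish": 0.7, "sleep": 0.3}, "V": {"fish": 0.4, "sleep": 0.6}}
print(viterbi(["fish", "sleep"], ["N", "V"], start_p, trans_p, emit_p))  # ['N', 'V']
```

The forward-backward algorithm mentioned in the syllabus uses the same trellis but sums over paths instead of maximizing, which is what makes unsupervised parameter estimation possible.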

Last update: Hajič Jan, prof. RNDr., Dr. (28.09.2020)
Charles University | Information system of Charles University