Institution: Johns Hopkins University
Subject:
Description:
Recently, statistical learning has played a leading role in informing the empiricist/nativist and connectionist/symbolic debates. But just what is "statistical learning," and what is new about it? This course presents theories of statistical learning, such as Bayesian models, causal networks, and information-theoretic models (e.g., the Minimum Description Length and Maximum Entropy formalisms). These methods have driven revolutions in machine vision and natural language processing. During the course, these methods will be compared with other numerical learning methods, such as connectionist networks, and with non-numerical learning theories, such as Gold's classic learnability theory and its probabilistic extension, PAC (probably approximately correct) learning theory. This recent work has fundamental implications for the ancient problem of induction. Prerequisites: With instructor permission, this course is open to upper-class undergraduates concentrating in computation.
Credits: 3.00
Credit Hours:
Prerequisites:
Corequisites:
Exclusions:
Level:
Instructional Type: Lecture
Notes:
Additional Information:
Historical Version(s):
Institution Website:
Phone Number: (410) 516-8000
Regional Accreditation: Middle States Association of Colleges and Schools
Calendar System: Semester