Computational Learning Theory at NUS
Computational Learning Theory concerns the mathematical analysis of
algorithms that either adapt to their environments or try to model parts of
them. The work of our group at NUS ranges from the study of detailed
theoretical models of concrete applied problems to basic questions about
more abstract problems.
The members of the group are Sanjay Jain, Lee Wee Sun and Frank Stephan.
Research Topics
- Automatic functions, linear time and learning.
We characterise automatic functions as those which can be computed by a
position-faithful one-tape Turing machine in linear time, and take this as a
starting point to investigate various classes of learners which, in each
cycle, update their memory in time linear in the length of the longest datum
seen so far. (A toy automatic function is sketched after this list.)
- Learnability of Automatic Structures.
In this project we consider the learnability of uniformly regular classes
(automatic classes) under various constraints. In particular, we consider
the situation where the learners themselves are required to be automatic.
(A toy learner in this spirit is sketched after this list.)
- Enlarging Learnable Classes.
An early result in inductive inference shows that the class of Ex-learnable
sets is not closed under unions (the classical example is sketched after
this list). We considered the following question: for which classes of
functions is the union with an arbitrary Ex-learnable class again
Ex-learnable, either effectively or non-effectively? We show that the
effective case and the non-effective case separate, and we give a
sufficient criterion for the effective case. We also consider the
possibility of (effectively) extending learners to learn (infinitely) many
more functions. It is known that all Ex-learners learning a dense set of
functions can be effectively extended to learn infinitely more. It was
open whether learners learning a non-dense set of functions can be
similarly extended. We show that this is not possible, but we give an
alternative split of all possible learners into two sets such that, for
each of the sets, all learners from that set can be effectively extended.
We also analyse similar concepts for other learning criteria.
- The Amount of Nonconstructivity in Learning Formal Languages from Positive Data.
We study the amount of nonconstructivity needed to learn classes of formal
languages from positive data. Different learning types are compared with
respect to the amount of nonconstructivity needed to learn indexable
classes and recursively enumerable classes, respectively. Matching upper
and lower bounds on the amount of nonconstructivity are shown.
- Partial Consistent Learning.
Partial identification means outputting one correct hypothesis infinitely
often and every other hypothesis only finitely often; this allows one to
learn the class of all recursive functions. We studied consistent partial
identification under three versions of consistency, for input given either
in canonical order or in arbitrary order. We obtained several
characterisation results, and also showed that, for arbitrary input
presentation, one of the versions of consistency (called TCons) makes
consistent partial learning equivalent to normal consistent learning in the
limit. For canonical input, there are exactly two inference degrees for
TCons-partial learning (omniscient and trivial). (A toy consistent learner
is sketched after this list.)
- Optimal Numberings.
In this work we considered which numberings are optimal for learning, in
the sense that every learnable class can be learnt using these numberings
as hypothesis space. We showed that the sets of numberings optimal for
various learning criteria are usually incomparable.
- Mitoticity.
Mitoticity deals with whether a class can be split into two parts of equal
complexity. We considered this with respect to intrinsic complexity and
showed that complete classes can be so split, but not necessarily other
classes.
- Complexity of Approximate Planning.
Planning for partially observable Markov decision processes (POMDPs) is
known to be intractable in the worst case. We study subclasses of POMDP
planning problems and the factors that make some of them easier to solve.
(The underlying belief update is sketched after this list.)
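
The sketches below illustrate some of the notions above in Python; they are
toy examples of ours, with hypothetical names, not constructions from the
papers. First, an automatic function: a standard example is the successor
function on binary numbers written least significant bit first, which can be
computed symbol by symbol with only a constant amount of carried state, and
hence in linear time.

    def successor_lsb_first(bits: str) -> str:
        """Add 1 to a binary number given least significant bit first."""
        out = []
        carry = 1                   # we are adding 1
        for b in bits:
            s = int(b) + carry
            out.append(str(s % 2))  # emit one output bit per input bit
            carry = s // 2          # constant-size state carried forward
        if carry:
            out.append('1')         # the number grew by one digit
        return ''.join(out)

    assert successor_lsb_first('111') == '0001'  # 7 + 1 = 8
    assert successor_lsb_first('010') == '110'   # 2 + 1 = 3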
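
Second, a toy automatic class: for the hypothetical family
L_alpha = { w : w extends the prefix alpha }, a learner whose entire memory
is the longest common prefix of the data seen so far succeeds, and this
memory update works letter by letter, in the spirit of automatic learners.

    def common_prefix(u: str, v: str) -> str:
        i = 0
        while i < min(len(u), len(v)) and u[i] == v[i]:
            i += 1
        return u[:i]

    def learn_prefix_class(text):
        """Read a stream of positive examples; yield a hypothesis each cycle."""
        memory = None
        for datum in text:
            memory = datum if memory is None else common_prefix(memory, datum)
            yield memory  # hypothesis: the language memory.Sigma*

    for hyp in learn_prefix_class(['abba', 'abab', 'abc']):
        print('conjecture: prefix', repr(hyp))
    # conjectures 'abba', 'ab', 'ab': the learner converges to ab.Sigma*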
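
Third, the non-union phenomenon. The classical example (our choice of
illustration; the paper's constructions are more refined) pairs the
self-describing functions, where f(0) is a program for f, with the functions
that are zero almost everywhere. Each class alone has a trivial Ex-learner,
shown below with symbolic hypotheses, but no computable learner Ex-learns
their union.

    def learner_self_describing(data):
        """On f(0), ..., f(n), conjecture the program f(0)."""
        return ('program', data[0])

    def learner_almost_zero(data):
        """Conjecture: agree with the data seen so far, then output 0."""
        return ('patched-zero', tuple(data))

    print(learner_self_describing([5, 0, 0]))  # ('program', 5)
    print(learner_almost_zero([5, 0, 0]))      # ('patched-zero', (5, 0, 0))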
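
Fourth, consistency. A minimal sketch over a small hypothetical hypothesis
space: on canonical input f(0), ..., f(n) the learner outputs the least
hypothesis consistent with everything seen so far; on a uniformly decidable
class this strategy is consistent and also converges in the limit.

    HYPOTHESES = [
        lambda x: 0,      # h0: the constant-zero function
        lambda x: x,      # h1: the identity
        lambda x: x * x,  # h2: squaring
    ]

    def consistent_learner(data):
        """Return the least index of a hypothesis consistent with the data."""
        for e, h in enumerate(HYPOTHESES):
            if all(h(x) == y for x, y in enumerate(data)):
                return e
        return None  # no hypothesis in this space is consistent

    print(consistent_learner([0, 1]))     # 1: the identity fits 0, 1
    print(consistent_learner([0, 1, 4]))  # 2: only squaring fits 0, 1, 4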
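
Finally, POMDP planning works in belief space: after each action and
observation, the agent's belief over states is updated by Bayes' rule. The
two-state model below is hypothetical, chosen only to show the update that
any POMDP planner, exact or approximate, builds on.

    T = {  # T[action][s][s']: transition probabilities
        'stay': [[0.9, 0.1], [0.1, 0.9]],
    }
    Z = {  # Z[action][s'][o]: observation probabilities
        'stay': [[0.8, 0.2], [0.3, 0.7]],
    }

    def belief_update(b, action, obs):
        """One step of the POMDP belief (Bayes) filter."""
        new_b = []
        for s2 in range(len(b)):
            pred = sum(b[s1] * T[action][s1][s2] for s1 in range(len(b)))
            new_b.append(Z[action][s2][obs] * pred)
        norm = sum(new_b)
        return [p / norm for p in new_b]

    print(belief_update([0.5, 0.5], 'stay', 0))  # belief shifts toward state 0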
Publications
Some recent publications related to the above research topics are:
- Automatic functions, linear time and learning. John Case, Sanjay Jain,
Samuel Seah and Frank Stephan. In How the World Computes: Turing Centenary
Conference and Eighth Conference on Computability in Europe, CiE 2012,
Cambridge, UK, June 18--23, 2012, pages 96--106. Lecture Notes in Computer
Science 7318. Springer Verlag, 2012.
- Automatic Learning of Subclasses of Pattern Languages. John
Case, Sanjay Jain, Trong Dao Le, Yuh Shin Ong, Pavel Semukhin, and Frank Stephan. In Information and Computation,
Volume 218, pages 17--35, 2012.
- Enlarging Learnable Classes. Sanjay Jain, Timo Kotzing and Frank Stephan.
In Nader Bshouty, Gilles Stoltz, Nicolas Vayatis and Thomas Zeugmann,
editors, Algorithmic Learning Theory, 23rd International Conference,
ALT 2012, pages 36--50. Lecture Notes in Artificial Intelligence 7568.
Springer Verlag, 2012.
- On the Amount of Nonconstructivity
in Learning Formal Languages from Positive Data. Sanjay Jain, Frank
Stephan and Thomas Zeugmann. In Manindra Agrawal, S. Barry
Cooper and Angsheng Li, editors, Theory and
Applications of Models of Computation, 9th Annual Conference, TAMC 2012,
pages 423--434. Lecture Notes in Computer Science 7287. Springer Verlag, 2012.
- Iterative Learning from Texts and Counterexamples Using
Additional Information. Sanjay Jain and Efim Kinber. In Machine Learning, Volume 84, Number 3,
pages 291--333, 2011.
- Learnability of Automatic Classes. Sanjay
Jain, Qinglong Luo and
Frank Stephan. In Journal of Computer and System Sciences, Volume 78, Issue
6, pages 1910--1927, 2012. (Special issue on LATA 2010).
- Numberings Optimal for Learning. Sanjay Jain and Frank
Stephan. In Journal of Computer and System Sciences, Volume 76, pages
233--250, 2010.
- Uncountable Automatic Classes and Learning. Sanjay
Jain, Qinglong Luo, Pavel Semukhin and Frank
Stephan. In Algorithmic Learning Theory, 20th International Conference,
ALT 2009, Porto, Portugal, October 2009, pages 293--307. Lecture Notes in
Artificial Intelligence 5809. Springer Verlag,
2009.
- Consistent Partial Learning. Sanjay Jain and Frank
Stephan. In Adam Klivans and Sanjoy Dasgupta, editors,
Conference on Learning Theory, Montreal, Canada, 2009.
- Mitotic Classes in Inductive Inference. Sanjay Jain and
Frank Stephan. In SIAM Journal on Computing, Volume 38, Number 4, pages
1283--1299, 2008.
- What Makes Some POMDP Problems Easy to Approximate?
David Hsu, Wee Sun Lee and Nan Rong. In Neural
Information Processing Systems, 2007.
More publications can be found on the group members' home pages.