Sally Floyd: Other Papers
- Floyd, S., and Warmuth, M., Sample Compression, Learnability, and the
Vapnik-Chervonenkis Dimension. Machine Learning, V. 21, p. 269, 1995.
Abstract from ResearchIndex.
- Floyd, S., and Karp, R., FFD Bin-packing for Items with
Distributions on [0, 1/2]. Algorithmica, V. 6, No. 2, pp. 222-240, 1991.
An earlier version of this paper appeared in the
27th Annual Symposium on Foundations of Computer Science,
pp. 322-330, October 1986. (A small illustrative FFD sketch appears
after this list.)
"A number of fundamental results in science,
mathematics, and engineering concern critical phenomena, in which the
behavior of a system depends discontinuously on some parameter.
For example, the behavior of a queueing system depends critically
on the ratio between the arrival rate of customers and the service rate
of the server. ...
In a recent paper [BLJM], a critical phenomenon is reported
in the behavior of the first-fit decreasing (FFD) bin-packing
algorithm when the items are drawn independently from the uniform
distribution over [0, mu]. ... The present paper connects the
critical phenomenon discovered in [BLJM] to the critical
phenomenon mentioned above concerning queueing systems."
- Floyd, S., On Space-Bounded Learning and the Vapnik-Chervonenkis Dimension.
Ph.D. thesis, International Computer Science Institute Technical Report
TR-89-061, ICSI, Berkeley, CA, December 1989. (A simplified sketch of the
space-bounded learning idea appears after this list.)
"This thesis explores algorithms that learn a concept from a concept
class of Vapnik-Chervonenkis (VC) dimension d by saving at most d examples
at a time. The framework is the model of probably approximately correct
(pac) learning introduced by Valiant [V84]. A maximum concept class
of VC dimension d is defined. For a maximum class C of VC dimension d,
we give an algorithm for representing a finite set of positive and
negative examples of a concept by a subset of d labeled examples of that
set. This data compression scheme of size d is used to construct a
space-bounded algorithm called the iterative compression algorithm
that learns a concept from the class C by saving at most d examples
at a time. These d examples represent the current hypothesis of the
learning algorithm. A space-bounded algorithm is called acyclic if
a hypothesis that has been rejected as incorrect is never reinstated.
We give a sufficient condition for the iterative compression algorithm
to be acyclic on a maximum class C. Classes for which the iterative
compression algorithm is acyclic include positive half-spaces
in Euclidean space E^n, balls in E^n, and arbitrary rectangles and triangles
in the plane. The iterative compression algorithm can be thought of
as learning a boundary between the positive and negative examples."
- Floyd, S., On Space-Bounded Learning and the Vapnik-Chervonenkis Dimension.
Proceedings of the Second Annual Workshop on Computational Learning Theory,
Morgan Kaufmann, 1989. This is a survey of results from the thesis.
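
The first-fit decreasing (FFD) rule studied in the Floyd and Karp paper above
is simple to state: sort the items in decreasing order of size, place each item
into the first open bin that still has room for it, and open a new bin only
when none does. The Python sketch below is purely illustrative (the function
name, the value of mu, and the experiment itself are not from the paper); it
packs items drawn uniformly from [0, mu] and reports the number of bins used
and the total empty space, the kind of quantity in which the critical
phenomenon shows up.

    import random

    def ffd_pack(items, capacity=1.0):
        """First-fit decreasing: sort items largest-first, place each item in
        the first open bin with enough room, open a new bin only if none fits."""
        free = []                       # remaining space in each open bin
        for item in sorted(items, reverse=True):
            for i, space in enumerate(free):
                if item <= space:
                    free[i] = space - item
                    break
            else:                       # no open bin fits: open a new one
                free.append(capacity - item)
        return len(free), sum(free)     # bins used, total empty (wasted) space

    # Items drawn independently from the uniform distribution on [0, mu].
    mu = 0.4
    items = [random.uniform(0.0, mu) for _ in range(10000)]
    bins, waste = ffd_pack(items)
    print("mu = %.2f: %d bins, wasted space %.2f" % (mu, bins, waste))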
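
As a much simplified illustration of the space-bounded learning idea in the
thesis abstract above (this is not the thesis's construction for maximum
classes), consider axis-parallel rectangles in the plane, a class of VC
dimension 4: the learner saves at most four positive examples, those with
extreme x- and y-coordinates, uses their bounding box as its hypothesis, and
changes the hypothesis only when a positive example falls outside the current
box. All names in the Python sketch below are illustrative.

    import random
    from typing import List, Optional, Tuple

    Point = Tuple[float, float]
    Box = Tuple[float, float, float, float]        # (x_lo, x_hi, y_lo, y_hi)

    def bounding_box(points: List[Point]) -> Box:
        """Smallest axis-parallel rectangle containing the saved positive examples."""
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return (min(xs), max(xs), min(ys), max(ys))

    def inside(p: Point, box: Box) -> bool:
        x_lo, x_hi, y_lo, y_hi = box
        return x_lo <= p[0] <= x_hi and y_lo <= p[1] <= y_hi

    def extremes(points: List[Point]) -> List[Point]:
        """Keep at most four examples, those with extreme x- and y-coordinates;
        they alone determine the bounding-box hypothesis (a compression
        scheme of size at most 4 for this toy class)."""
        return list({
            min(points, key=lambda p: p[0]),
            max(points, key=lambda p: p[0]),
            min(points, key=lambda p: p[1]),
            max(points, key=lambda p: p[1]),
        })

    def iterative_learner(stream) -> Optional[Box]:
        """Space-bounded learner: at most four saved examples represent the
        current hypothesis; only misclassified positive examples change it."""
        saved: List[Point] = []
        for point, label in stream:
            if label and (not saved or not inside(point, bounding_box(saved))):
                saved = extremes(saved + [point])
        return bounding_box(saved) if saved else None

    # Example: learn the rectangle [0.2, 0.6] x [0.3, 0.9] from labeled points.
    target = (0.2, 0.6, 0.3, 0.9)
    sample = [((x, y), inside((x, y), target))
              for x, y in ((random.random(), random.random()) for _ in range(1000))]
    print(iterative_learner(sample))

Because the bounding box only grows, a rejected hypothesis is never
reinstated, so this toy learner is acyclic in the sense described in the
thesis abstract.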
Sally Floyd, floyd at ee.lbl.gov
Last modified: January, 2001