Non-Clashing Teaching Maps for Balls in Graphs

09/06/2023
by Jérémie Chalopin, et al.

Recently, Kirkpatrick et al. [ALT 2019] and Fallat et al. [JMLR 2023] introduced non-clashing teaching and showed it to be the most efficient machine teaching model satisfying the benchmark for collusion-avoidance set by Goldman and Mathias. A teaching map T for a concept class 𝒞 assigns a (teaching) set T(C) of examples to each concept C ∈ 𝒞. A teaching map is non-clashing if no pair of distinct concepts is consistent with the union of their teaching sets. The size of a non-clashing teaching map (NCTM) T is the maximum size of a teaching set T(C), C ∈ 𝒞. The non-clashing teaching dimension NCTD(𝒞) of 𝒞 is the minimum size of an NCTM for 𝒞. NCTM^+ and NCTD^+(𝒞) are defined analogously, except that the teacher may only use positive examples. We study NCTMs and NCTM^+s for the concept class ℬ(G) consisting of all balls of a graph G. We show that the associated decision problem B-NCTD^+ for NCTD^+ is NP-complete in split, co-bipartite, and bipartite graphs. Surprisingly, we even prove that, unless the ETH fails, B-NCTD^+ does not admit an algorithm running in time 2^(2^o(vc)) · n^O(1), nor a kernelization algorithm outputting a kernel with 2^o(vc) vertices, where vc is the vertex cover number of G. These are extremely rare results: it is only the second (resp. fourth) problem in NP known to admit a double-exponential lower bound parameterized by vc (resp. treewidth), and one of only a few problems to admit an ETH-based conditional lower bound on the number of vertices in a kernel. We complement these lower bounds with matching upper bounds. For trees, interval graphs, cycles, and trees of cycles, we derive NCTM^+s or NCTMs for ℬ(G) of size proportional to its VC-dimension. For Gromov-hyperbolic graphs, we design an approximate NCTM^+ for ℬ(G) of size 2.
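To make the non-clashing condition concrete, here is a minimal Python sketch (not from the paper; all names and the toy path graph are illustrative) that checks whether a candidate teaching map is non-clashing. It uses the radius-1 balls of the path 1–2–3–4 as the concept class and positive examples only, so the map shown is, under these assumptions, an NCTM^+ of size 2 for that toy class.

```python
from itertools import combinations

def consistent(concept, examples):
    # A concept (a set of vertices) is consistent with a set of labeled
    # examples (vertex, label) if it contains exactly the positively
    # labeled vertices among them.
    return all((x in concept) == label for x, label in examples)

def is_non_clashing(teaching_map):
    # Non-clashing condition: no two distinct concepts may both be
    # consistent with the union of their teaching sets.
    return not any(
        consistent(C1, teaching_map[C1] | teaching_map[C2])
        and consistent(C2, teaching_map[C1] | teaching_map[C2])
        for C1, C2 in combinations(teaching_map, 2)
    )

# Toy example (illustrative): the radius-1 balls of the path 1-2-3-4,
# taught with positive examples only (every label is True).
balls = {
    frozenset({1, 2}):    {(1, True)},
    frozenset({1, 2, 3}): {(1, True), (3, True)},
    frozenset({2, 3, 4}): {(2, True), (4, True)},
    frozenset({3, 4}):    {(4, True)},
}
print(is_non_clashing(balls))  # True; the largest teaching set has size 2
```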
