Multitask Learning via Shared Features: Algorithms and Hardness

09/07/2022
by Konstantina Bairaktari, et al.

We investigate the computational efficiency of multitask learning of Boolean functions over the d-dimensional hypercube that are related by means of a feature representation of size k ≪ d shared across all tasks. We present a polynomial-time multitask learning algorithm for the concept class of halfspaces with margin γ, which is based on a simultaneous boosting technique and requires only poly(k/γ) samples per task and poly(k log(d)/γ) samples in total. In addition, we prove a computational separation: assuming there exists a concept class that cannot be learned in the attribute-efficient model, we construct another concept class that can be learned in the attribute-efficient model but cannot be multitask learned efficiently; multitask learning this concept class either requires super-polynomial time or a much larger total number of samples.
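To make the shared-feature setting concrete, here is a minimal numpy sketch of the problem the abstract describes: T tasks whose labels are halfspaces over a common k-dimensional projection of d-dimensional inputs. The recovery method shown (a crude moment estimator per task followed by an SVD to extract the shared subspace) is an illustrative baseline under Gaussian inputs, not the paper's simultaneous boosting algorithm; all names and parameter values below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, T, n = 100, 3, 20, 200  # ambient dim, shared feature dim, tasks, samples per task

# Ground truth (illustrative): a shared k x d feature map B and
# per-task halfspace directions u_t acting on the shared features.
B = rng.standard_normal((k, d))
U = rng.standard_normal((T, k))

X = rng.standard_normal((T, n, d))                  # inputs per task
Y = np.sign(np.einsum("tk,kd,tnd->tn", U, B, X))    # halfspace labels per task

# Step 1: estimate each task's halfspace direction independently,
# here via the label-weighted mean E[y x] (a simple moment estimator).
W = np.einsum("tn,tnd->td", Y, X) / n               # (T, d) stacked estimates

# Step 2: recover the shared k-dimensional subspace from the stacked
# per-task estimates via SVD.
_, _, Vt = np.linalg.svd(W, full_matrices=False)
B_hat = Vt[:k]                                      # estimated shared feature map

# Step 3: re-fit each task cheaply inside the learned k-dim feature space.
Z = np.einsum("kd,tnd->tnk", B_hat, X)
U_hat = np.einsum("tn,tnk->tk", Y, Z) / n

acc = np.mean(np.sign(np.einsum("tk,tnk->tn", U_hat, Z)) == Y)
print(f"mean training accuracy across tasks: {acc:.3f}")
```

The point of the sketch is the sample-complexity intuition behind the result: the d-dependent work (recovering the shared subspace) is amortized across all T tasks, while each individual task only needs enough samples to fit a classifier in k ≪ d dimensions.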
