Authors: Azad Naik, Anveshi Charuvaka, Huzefa Rangwala
Keywords:
Abstract: Multi-task learning (MTL) is a supervised learning paradigm in which the prediction models for several related tasks are learned jointly to achieve better generalization performance. When there are only a few training examples per task, MTL considerably outperforms traditional single-task learning (STL) in terms of accuracy. In this work we develop an MTL-based approach for classifying documents that are archived within dual concept hierarchies, namely, DMOZ and Wikipedia. We solve the multi-class classification problem by defining one-versus-rest binary classification tasks for each of the different classes across the two hierarchical datasets. Instead of learning each linear discriminant independently, we learn them jointly using MTL, with the relationships between tasks across the datasets established using a non-parametric, lazy, nearest neighbor approach. We also evaluate transfer learning (TL) and compare the MTL (and TL) methods against standard single-task learning and semi-supervised learning approaches. Our empirical results demonstrate the strength of our developed methods and show improvement especially when there are fewer training examples per task.
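The one-versus-rest reduction mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' method: the paper learns the binary discriminants jointly via MTL, whereas the toy code below trains each class-vs-rest perceptron independently (i.e., the STL baseline the abstract compares against). All function names and the toy data are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): reducing a multi-class problem to
# one binary "class vs. rest" linear discriminant per class. Here each binary
# model is an independently trained perceptron; the paper instead couples the
# binary tasks through multi-task learning with nearest-neighbor task relations.

def train_perceptron(X, y, epochs=100):
    """Train a binary perceptron; labels in y are +1 (class) or -1 (rest)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * score <= 0:  # misclassified: standard perceptron update
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
    return w, b

def train_one_vs_rest(X, labels, classes):
    """One binary (class vs. rest) discriminant per class."""
    models = {}
    for c in classes:
        y = [1 if l == c else -1 for l in labels]
        models[c] = train_perceptron(X, y)
    return models

def predict(models, x):
    """Assign the class whose binary discriminant scores highest."""
    def score(wb):
        w, b = wb
        return sum(wj * xj for wj, xj in zip(w, x)) + b
    return max(models, key=lambda c: score(models[c]))

# Toy 2-D data with three linearly separable classes (hypothetical example).
X = [(0.0, 1.0), (0.1, 0.9), (1.0, 0.0), (0.9, 0.1), (-1.0, -1.0), (-0.9, -1.1)]
labels = ["a", "a", "b", "b", "c", "c"]
models = train_one_vs_rest(X, labels, ["a", "b", "c"])
```

In the MTL setting the abstract describes, the per-class weight vectors would not be fit in isolation as above; a shared regularizer ties together tasks identified as related across the DMOZ and Wikipedia hierarchies.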