
% Synthetic illustration for binary classification
% References:
%   Yanbo Fan, Siwei Lyu, Yiming Ying and Bao-Gang Hu. "Learning with
%   Average Top-K Loss", NIPS, 2017.
%
%   version 1.0 -- Oct 2017

-------------------------------------------------------------------

############# clean data #############
case1: clean data; all three methods (average, matk, minimax) achieve their best performance


############# illustration on outlier #############
case2: case1 with one outlier added; both average and matk perform well, while minimax is influenced by the outlier


############# illustration on multi-modal data #############
case3: multi-modal data without outliers; matk and minimax perform well, while average is influenced by the multi-modality

case4: multi-modal data with one outlier; matk performs well, while average is influenced by the multi-modality and minimax is influenced by the outlier


############# illustration on imbalanced data #############
case5: imbalanced data without outliers; both matk and minimax perform well, while average is influenced by the class imbalance

case6: imbalanced data with one outlier; matk performs well, while minimax is influenced by the outlier and average is influenced by the class imbalance
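As a rough intuition for why the three methods behave differently in the cases above, here is a minimal Python sketch of the three aggregate losses (the function names are ours, not from the released code): average takes the mean of all individual losses, minimax takes the single largest one, and matk (the average top-k loss) averages only the k largest. On a toy set of individual losses containing one outlier, the maximum is dominated entirely by the outlier, while the top-k average tempers its influence.

```python
import numpy as np

def average_loss(losses):
    # mean of all individual losses
    return np.mean(losses)

def maximum_loss(losses):
    # "minimax": the single largest individual loss
    return np.max(losses)

def average_topk_loss(losses, k):
    # "matk": mean of the k largest individual losses
    return np.mean(np.sort(losses)[-k:])

# toy individual losses with one outlier
losses = np.array([0.1, 0.2, 0.1, 5.0])
print(average_loss(losses))        # 1.35 -- pulled up by the outlier
print(maximum_loss(losses))        # 5.0  -- determined solely by the outlier
print(average_topk_loss(losses, 2))  # 2.6 -- outlier diluted by the 2nd-largest loss
```

Note that average top-k interpolates between the two extremes: k = 1 recovers the maximum loss, and k = n recovers the average loss.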



-----------------------------------------------------------------------------------------------------------------
illustration order (figure layout):
case1 -> case2 -> case3
      -> case4 -> case5
      -> case6

individual loss: logistic (in main paper) and hinge (in supplementary material)
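For reference, the two individual losses take their standard forms; a minimal sketch in Python (the released code is MATLAB), written in terms of the classification margin m = y*f(x):

```python
import numpy as np

def logistic_loss(margin):
    # logistic individual loss: log(1 + exp(-y*f(x)))
    # log1p improves numerical accuracy for small exp(-margin)
    return np.log1p(np.exp(-margin))

def hinge_loss(margin):
    # hinge individual loss: max(0, 1 - y*f(x))
    return np.maximum(0.0, 1.0 - margin)
```

Both are convex, non-increasing functions of the margin, which is what the paper's analysis of the average top-k aggregate assumes for the individual loss.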