1. [Paper: SelectiveNet - A Deep Neural Network with an Integrated Reject Option by... 03:00
2. Motivation 03:54
3. Uncertainty Landscape 04:14
4. The Story of Alice and Bob 05:09
5. High Level Overview 05:50
6. Supervised Learning 06:26
7. Selective Classification 06:50
8. Risk Coverage Tradeoff 07:39
9. Confidence Rate Functions 08:41
10. Confidence - Softmax Response 09:24
11. Confidence - MC-Dropout 10:17
12. From Uncertainty to Selective Classifier 11:01
13. SR or MC-Dropout? 11:21
14. Softmax Response is Not Optimal 12:09
15. SelectiveNet 13:38
16. Optimization 14:43
17. Back to Bob and Alice 15:51
18. Auxiliary Output 16:33
19. Empirical Results 17:50
20. Embedding Analysis 18:44
21. Conclusion 19:37
22. Q/A 20:06
23. [Paper: Manifold Mixup - Better Representations by Interpolating Hidden States b... 22:49
24. Troubling Properties of Deep Networks 23:26
25. Manifold Mixup - Simple Algorithm 24:49
26. Manifold Mixup - Great Results (external) 26:04
27. Manifold Mixup - Surprising Properties 26:48
28. Manifold Mixup - Theory Justifying Properties 27:35
29. What can Manifold Mixup do for you (applied)? 28:12
30. What can you do for Manifold Mixup (theory)? 28:27
31. [Paper: Processing Megapixel Images with Deep Attention-Sampling Models by Angel... 29:13
32. How do DNNs process large images? 29:41
33. Our contributions 30:18
34. Soft Attention 30:44
35. Attention Sampling 30:59
36. Processing Megapixel Images with Deep Attention-Sampling Models 31:38
37. Qualitative evaluation of the attention distribution 32:06
38. Thank you for your time! 32:47
39. [Paper: TapNet: Neural Network Augmented with Task-Adaptive Projection for Few-Sh... 33:33
40. Few-Shot Learning 34:00
41. TapNet: Task-Adaptive Projection Network 34:23
42. How to Construct Projection Space M 36:02
43. Classification and Learning 36:48
44. Observations 37:02
45. Results and Conclusions 37:27
46. [Paper: Online Meta-Learning by Chelsea Finn] 38:08
47. In Many Practical Situations 38:26
48. The Online Meta-Learning Setting 40:04
49. Follow the Meta-Leader (FTML): 41:13
50. FTML: 41:48
51. Experiments 42:09
52. Takeaways 42:39
53. [Paper: Training Neural Networks with Local Error Signals by Lars H. Eidnes] 43:20
54. Local learning 43:43
55. Training each layer on its own works! 45:36
56. The approach 45:56
57. Similarity matching loss 47:23
58. Results 48:48
59. Optimization vs generalization 52:33
60. Sim-loss + global backprop 53:47
61. Results, backprop-free version 54:12
62. Intriguing questions 55:41
63. [Paper: GMNN: Graph Markov Neural Networks by Jian Tang] 1:02:20
64. Semi-supervised Node Classification 1:02:38
65. Related Work: Statistical Relational Learning 1:02:59
66. Related Work: Graph Neural Networks 1:04:28
67. GMNN: Graph Markov Neural Networks 1:05:31
68. Two Graph Neural Networks co-train with Each Other 1:06:04
69. Experimental Results 1:07:26
70. [Paper: Self-Attention Graph Pooling by Junhyun Lee] 1:08:05
71. Research Background / Motivation 1:08:21
72. Goal 1:09:25
73. Related Work 1:09:54
74. Self-Attention Graph Pooling 1:10:38
75. Evaluation 1:11:11
76. Combating Label Noise in Deep Learning using Abstention 1:12:45
77. [Paper: Combating Label Noise in Deep Learning using Abstention by Sunil Thulasi... 1:12:46
78. A Practical Challenge for Deep Learning 1:12:55
79. Annotation is labor intensive! 1:13:07
80. Approaches to large-scale labeling 1:13:19
81. Label noise is an inconsistent mapping from features X to labels 1:13:46
82. The Deep Abstaining Classifier (DAC) 1:13:57
83. Training a Deep Abstaining Classifier 1:14:16
84. Abstention Dynamics 1:14:52
85. The DAC gives state-of-the-art results in label-noise experiments. 1:15:20
86. Abstention in the presence of Systematic Label Noise: The Random Monkeys Experim... 1:15:42
87. Random Monkeys: DAC Predictions on Monkey Images 1:16:09
88. Image Blurring 1:16:26
89. DAC Behavior on Blurred Images 1:16:38
90. Conclusions 1:16:53
91. Joint work with... 1:17:08
92. [Paper: LGM-Net: Learning to Generate Matching Networks for Few-Shot Learning by ... 1:17:36
93. Motivation 1:17:55
94. Approach 1:18:40
95. MetaNet Module (meta-learner) 1:19:18
96. TargetNet Module (base-learner) 1:19:48
97. Learning Algorithm 1:19:59
98. Comparison 1:20:43
99. Results on Synthetic Datasets 1:21:12
100. At the poster: 1:22:03