CHAPTERS
1. Welcome (05:39)
2. [PAPER: NEURAL VOICE CLONING WITH A FEW SAMPLES] (06:28)
3. Motivations (06:41)
4. Voice Cloning (07:21)
5. Speaker Adaptation (08:07)
6. Speaker Adaptation Analysis (08:34)
7. Speaker Encoding (08:57)
8. Results (09:50)
9. Voice Morphing via Embedding Manipulation (10:18)
10. Thank you! (10:44)
11. [PAPER: ANSWERER IN QUESTIONER'S MIND: INFORMATION THEORETIC APPROACH TO GOAL-ORIE...] (11:04)
12. Problem Definition: GuessWhat?! (11:34)
13. Previous Architectures (12:12)
14. Our Method: AQM (Answerer in Questioner's Mind) (12:56)
15. Experiment Results (14:10)
16. [PAPER: NEURAL-SYMBOLIC VQA: DISENTANGLING REASONING FROM VISION AND LANGUAGE UNDE...] (14:47)
17. Task: Visual Reasoning (15:08)
18. CLEVR Dataset (15:49)
19. Neural-Symbolic Visual Question Answering (NS-VQA) (16:29)
20. Advantage 1: High Accuracy (18:06)
21. Advantage 2: Data Efficiency (18:26)
22. Advantage 3: Transparency and Interpretability (18:51)
23. Summary (19:16)
24. [PAPER: LEARNING TO OPTIMIZE TENSOR PROGRAMS] (19:39)
25. Goal: Deploy Deep Learning Everywhere (19:55)
26. Existing Approach (20:09)
27. Limitations of the Existing Approach (20:43)
28. Learning to Optimize Tensor Programs (21:29)
29. Search over Possible Program Transformations (21:42)
30. Learning-Based Program Optimizer (21:52)
31. Transfer Learning Among Different Workloads (22:54)
32. State-of-the-Art Performance (23:10)
33. [PAPER: GENERALISATION OF STRUCTURAL KNOWLEDGE IN THE HIPPOCAMPAL-ENTORHINAL SYS...] (23:42)
34. Introduction: Generalisation by factorisation of structural knowledge (24:24)
35. Spatial cognition and neuroscience: a zoo of cells (26:15)
36. Task: using structure to predict experience (28:40)
37. Conjunctive code allows memory retrieval from location (30:31)
38. How to learn representations (31:06)
39. Technical interlude: thinking fast and slow (31:52)
40. Model performance and behaviour (32:58)
41. Learned grid-like representations that generalise (33:35)
42. Learned place-like representations (34:17)
43. Basis functions for building transition structures (34:33)
44. Preserved grid-to-place-cell relationship across environments (35:52)
45. Conclusions (36:40)
46. Q/A (37:18)
47. [PAPER: A LIKELIHOOD-FREE INFERENCE FRAMEWORK FOR POPULATION GENETIC DATA USING ...] (39:28)
48. Population Genetic Data (40:06)
49. Exchangeable Neural Network (42:53)
50. [PAPER: GENERALISING TREE PROBABILITY ESTIMATION VIA BAYESIAN NETWORKS] (43:52)
51. Problem Setup (45:28)
52. Learning SBNs (47:16)
53. Experiments (47:53)
54. [PAPER: SUGAR: GEOMETRY-BASED DATA GENERATION] (48:30)
55. Acknowledgements (48:52)
56. Introduction and Motivation (49:00)
57. Diffusion Geometry (49:38)
58. Data Generation with Diffusion (49:53)
59. Applications & Results (50:53)
60. Conclusion (52:13)
61. [PAPER: POINT PROCESS LATENT VARIABLE MODELS OF LARVAL ZEBRAFISH BEHAVIOR] (52:25)
62. Key questions (53:59)
63. Modeling larval zebrafish behavior as a marked point process (54:24)
64. Point process latent variable models (54:46)
65. Come to our poster! (56:57)
66. [PAPER: A PROBABILISTIC POPULATION CODE BASED ON NEURAL SAMPLES] (57:38)
67. Why Probability in Perception? (58:17)
68. Sensory input is noisy and ambiguous (59:04)
69. The "forward engineering" approach: how to build a brain? (59:29)
70. Neuroscience faces a "reverse engineering" problem (1:00:23)
71. Canonical Parametric Example: Linear Probabilistic Population Codes (PPC) (1:01:45)
72. Canonical Sampling Example: Neural Sampling (1:02:37)
73. Bayesian Encoding vs Decoding (1:04:06)
74. Road Ahead (1:04:54)
75. Sampling-Based Inference (1:05:18)
76. Orientation Estimation (1:05:47)
77. Geometric Intuition (1:07:07)
78. Characterizing the Implicit Parametric Code (1:08:11)
79. Implications of the model (1:08:35)
80. Summary (1:09:17)
81. Q/A (1:10:14)
82. [PAPER: SPARSE ATTENTIVE BACKTRACKING: TEMPORAL CREDIT ASSIGNMENT THROUGH REMIND...] (1:11:56)
83. Credit assignment (1:12:37)
84. Credit assignment through time and memory (1:13:20)
85. Sparse Attentive Backtracking (1:14:42)
86. Some results (1:15:32)
87. Generalization and attention map (1:15:46)
88. [PAPER: LEARNING TEMPORAL POINT PROCESSES VIA REINFORCEMENT LEARNING] (1:16:19)
89. Motivation (1:16:44)
90. Point Process Model (1:18:05)
91. Traditional Maximum-Likelihood Framework (1:19:35)
92. New Reinforcement Learning Framework (1:20:35)
93. Optimal Reward (1:21:48)
94. Modeling Framework (1:22:13)
95. Numerical Results (1:22:21)
96. Poster (1:22:48)
97. [PAPER: PRECISION AND RECALL FOR TIME SERIES] (1:22:58)
98. Motivation: Time Series Anomaly Detection (1:23:20)
99. Motivation: Range-Based Anomalies (1:23:45)
100. Problem: How to Measure Accuracy? (1:24:38)
101. State of the Art (1:25:27)
102. Precision and Recall for Time Series (1:26:15)
103. Selected Experimental Results (1:26:58)
104. Key Takeaways (1:27:15)
105. More Information (1:27:39)
106. [PAPER: BAYESIAN NONPARAMETRIC SPECTRAL ESTIMATION] (1:27:59)
107. What is the Spectral Representation? (1:29:19)
108. Definition: Spectral Estimation (1:29:43)
109. Proposed Model (1:31:07)
110. Key findings (1:31:34)
111. Two Experiments (1:31:57)
112. POSTER SESSION (1:32:48)
Powered by VideoKen