<html><head><meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Emotion project page</title>
<style type="text/css" media="screen">
html, body, div, span, applet, object, iframe, h1, h2, h3, h4, h5, h6, p, blockquote, pre, a, abbr, acronym, address, big, cite, code, del, dfn, em, font, img, ins, kbd, q, s, samp, small, strike, strong, sub, tt, var, dl, dt, dd, ol, ul, li, fieldset, form, label, legend, table, caption, tbody, tfoot, thead, tr, th, td {
border: 0pt none;
font-family: inherit;
font-size: 100%;
font-style: inherit;
font-weight: inherit;
margin: 0pt;
outline-color: invert;
outline-style: none;
outline-width: 0pt;
padding: 0pt;
vertical-align: baseline;
}
a {
color: #1772d0;
text-decoration:none;
}
a:focus, a:hover {
color: #f09228;
text-decoration:none;
}
a.paper {
font-weight: bold;
font-size: 12pt;
}
b.paper {
font-weight: bold;
font-size: 12pt;
}
* {
margin: 0pt;
padding: 0pt;
}
body {
position: relative;
margin: 3em auto 2em auto;
width: 800px;
font-family: Lato, Verdana, Helvetica, sans-serif;
font-size: 14px;
background: #eee;
}
h2 {
font-family: Lato, Verdana, Helvetica, sans-serif;
font-size: 18pt;
font-weight: 700;
}
h3 {
font-family: Lato, Verdana, Helvetica, sans-serif;
font-size: 16px;
font-weight: 700;
}
strong {
font-family: Lato, Verdana, Helvetica, sans-serif;
font-size: 13px;
}
ul {
list-style: circle;
}
img {
border: none;
}
li {
padding-bottom: 0.5em;
margin-left: 1.4em;
}
strong, b {
font-weight:bold;
}
em, i {
font-style:italic;
}
div.section {
clear: both;
margin-bottom: 1.5em;
background: #eee;
}
div.spanner {
clear: both;
}
div.paper {
clear: both;
margin-top: 0.5em;
margin-bottom: 1em;
border: 1px solid #ddd;
background: #fff;
padding: 1em 1em 1em 1em;
}
div.paper div {
padding-left: 200px;
}
img.paper {
margin-bottom: 0.5em;
float: left;
width: 170px;
}
div.dissert {
clear: both;
margin-top: 0.5em;
margin-bottom: 1em;
border: 1px solid #ddd;
background: #fff;
padding: 1em 1em 1em 1em;
}
div.dissert div {
padding-left: 150px;
}
img.dissert {
margin-bottom: 0.5em;
float: left;
width: 140px;
}
span.blurb {
font-style:italic;
display:block;
margin-top:0.75em;
margin-bottom:0.5em;
}
pre, code {
font-family: 'Lucida Console', 'Andale Mono', 'Courier', monospace;
margin: 1em 0;
padding: 0;
}
div.paper pre {
font-size: 0.9em;
}
</style>
<script type="text/javascript" async="" src="./page_files/ga.js"></script><script type="text/javascript">
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-7953909-1']);
_gaq.push(['_trackPageview']);
(function() {
var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();
</script>
<script type="text/javascript" src="./page_files/hidebib.js"></script>
<link href="./page_files/css" rel="stylesheet" type="text/css">
<!--<link href='http://fonts.googleapis.com/css?family=Open+Sans+Condensed:300' rel='stylesheet' type='text/css'>-->
<!--<link href='http://fonts.googleapis.com/css?family=Open+Sans' rel='stylesheet' type='text/css'>-->
<!--<link href='http://fonts.googleapis.com/css?family=Yanone+Kaffeesatz' rel='stylesheet' type='text/css'>-->
</head>
<body>
<div class="paper" id="Panda:ECCV2018_Emo">
<img class="paper" title="ECCV 2018_Emo" src="./images/ECCV_2018.png">
<div>
<a class="paper" href="https://rpand002.github.io/Papers/ECCV_2018.pdf">Contemplating Visual Emotions: Understanding and Overcoming Dataset Bias</a><br>
<strong>Rameswar Panda</strong>, Jianming Zhang, Haoxiang Li, Joon-Young Lee, Xin Lu, Amit K. Roy-Chowdhury<br>
European Conference on Computer Vision (ECCV), 2018<br>
<span class="blurb"> We investigate different dataset biases and propose a curriculum-guided, webly supervised approach
for learning a generalizable emotion recognition model.</span>
</div>
<div class="spanner"></div>
</div>
<div class="section">
<h2>Datasets &amp; Models</h2>
<div class="paper">
We introduce three image emotion datasets, collected
from different sources, for model training and testing. The first, the <strong>WEBEmo</strong> dataset,
contains about 268,000 stock photos across 25 fine-grained emotion categories. The other two, <strong>Emotion-6</strong>
and <strong>UnBiasedEmo</strong>, are collected from Google and Flickr to study dataset bias in visual emotion recognition. <br> <br>
<a href="https://drive.google.com/file/d/1qOY-kAFtPYfUY12qeI-IRq7pjJ1YpG_z/view?usp=sharing"> [WEBEmo] </a>
<a href="https://drive.google.com/file/d/1I7fWwi8TtjeA6EYleUQOD7jIFOA0Bpx9/view?usp=sharing"> [Emotion-6]</a>
<a href="https://drive.google.com/file/d/1pqWD6ItRNtQXXPUwV4fqhVewSG8Cfgx7/view?usp=sharing"> [UnBiasedEmo]</a>
<br> <br>
All the trained models will be released soon.
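The Drive share links above can be turned into direct-download URLs programmatically. A minimal Python sketch follows; the <code>uc?export=download</code> endpoint is standard Google Drive behaviour rather than part of this release, and very large archives (e.g. WEBEmo) may additionally require a confirmation token, which tools such as gdown handle automatically:

```python
# Convert a Google Drive ".../file/d/<id>/view" share link into a
# direct-download URL using Drive's standard "uc?export=download" endpoint.

def drive_direct_url(share_url: str) -> str:
    """Extract the file id from a share link and build a download URL."""
    file_id = share_url.split("/file/d/")[1].split("/")[0]
    return f"https://drive.google.com/uc?export=download&id={file_id}"

# Share links as listed on this page.
datasets = {
    "WEBEmo": "https://drive.google.com/file/d/1qOY-kAFtPYfUY12qeI-IRq7pjJ1YpG_z/view?usp=sharing",
    "Emotion-6": "https://drive.google.com/file/d/1I7fWwi8TtjeA6EYleUQOD7jIFOA0Bpx9/view?usp=sharing",
    "UnBiasedEmo": "https://drive.google.com/file/d/1pqWD6ItRNtQXXPUwV4fqhVewSG8Cfgx7/view?usp=sharing",
}

for name, url in datasets.items():
    print(name, drive_direct_url(url))
```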
</div>
</div>
<div class="section">
<h2>Acknowledgements</h2>
<div class="paper">
This work is partially supported by NSF grant 1724341
and gifts from Adobe. We thank Victor Hill of UCR CS for setting up the computing infrastructure used in this work.
</div>
</div>
<div class="section">
<h2>Bibtex</h2>
<div class="paper">
<p> Please cite our paper if you find it useful for your research. </p>
<pre>
@inproceedings{panda2018contemplating,
  title     = {Contemplating Visual Emotions: Understanding and Overcoming Dataset Bias},
  author    = {Panda, Rameswar and Zhang, Jianming and Li, Haoxiang and Lee, Joon-Young and Lu, Xin and Roy-Chowdhury, Amit K},
  booktitle = {European Conference on Computer Vision},
  year      = {2018}
}
</pre>
</div>
</div>
</body></html>