Jekyll 2021-09-25T06:54:44+00:00 https://fkeel.github.io/feed.xml eyewear-pro.github.io web page

Cognitive Load Dataset in the works 2020-02-17T00:00:00+00:00 https://fkeel.github.io/posts/2020/02/17/cognitive

<p>This dataset includes all facial thermal data and electrooculography recordings for 20 users of smart glasses.</p>

Alertness EOG dataset released 2019-04-21T00:00:00+00:00 https://fkeel.github.io/posts/2019/04/21/alertness

<p>https://zenodo.org/record/2532900#.XlhF30Mo924</p>
<p>This dataset includes all electrooculography recordings for 16 users of J!NS MEME glasses over a two-week period. The dataset uploaded here was used in the paper “Continuous Alertness Assessments: Using EOG Glasses to Unobtrusively Monitor Fatigue Levels In-The-Wild.” In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland, UK. ACM, New York, NY, USA, 12 pages. https://doi.org/10.1145/3290605.3300694.</p>

33c3 Talk: Beyond VR and AR 2016-12-30T00:00:00+00:00 https://fkeel.github.io/posts/2016/12/30/33c3

<p>Last year I gave a talk at the 33c3 about recent
trends in research beyond virtual and augmented reality.</p>
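For readers who want to experiment with EOG recordings such as the alertness dataset above, here is a rough sketch of counting blinks in a vertical-EOG trace. All names, the sampling rate, and the thresholds are illustrative assumptions of mine, not the format or method of the released dataset:

```python
import numpy as np

def blink_rate(veog, fs=100.0, threshold=3.0, min_gap_s=0.2):
    """Estimate blinks per minute from a 1-D vertical-EOG trace.

    Blinks appear as large, brief deflections. We flag samples whose
    z-score magnitude exceeds `threshold`, then merge flagged samples
    closer together than `min_gap_s` into a single blink event.
    Threshold and gap values are illustrative, not tuned.
    """
    veog = np.asarray(veog, dtype=float)
    z = (veog - veog.mean()) / veog.std()
    above = np.flatnonzero(np.abs(z) > threshold)
    if above.size == 0:
        return 0.0
    # A new blink starts wherever consecutive flagged samples are
    # separated by more than min_gap_s worth of samples.
    n_blinks = 1 + int(np.sum(np.diff(above) > min_gap_s * fs))
    duration_min = len(veog) / fs / 60.0
    return n_blinks / duration_min
```

On a one-minute synthetic trace containing ten brief spikes, this returns a rate of 10 blinks per minute; real EOG data would of course need baseline drift removal and per-user calibration first.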
<p>Although most of the talk focuses on Superhuman Sports,
I also go into some detail about enabling technologies, mentioning smart eyewear
and the Presto project.</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/8DUkpUrFwMA" frameborder="0" allowfullscreen=""></iframe>
<p>Here’s the abstract of the talk:</p>
<p>With recent developments in capture technology,
preserving one's daily experiences and knowledge
becomes richer and more comprehensive. Furthermore,
new recording technologies beyond simple audio/video
recording are becoming available: 360° videos, tactile
recorders, and even odor recorders. These new recording
technologies and the massive amounts of data they generate
require new means for selecting, displaying, and
sharing experiences.</p>
<p>Sharing experiences and knowledge has always been essential
for human development: it enables skill transfer and empathy.
Over the course of history, mankind developed from oral traditions to
cultures of writing. With the ongoing digital revolution,
the hurdles to sharing knowledge and experiences vanish.
Already today it is, for example, technically feasible
to take and store 24/7 video recordings of one's life.
While this example creates massive collections of data, it also makes it even more challenging to share experiences and knowledge with others in meaningful ways.</p>

JST Presto Project on Open Eyewear 2016-11-10T00:00:00+00:00 https://fkeel.github.io/posts/2016/11/10/presto

<p><span class="image left"><img src="/images/default.jpg" alt="" /></span>
I'm excited to be one of the few non-Japanese researchers to receive a JST Presto (Sakigake) project grant, on the topic of Open Collective Eyewear.
This page will present updates and information about the project's progress.
</p>
<p>The official announcement is unfortunately only available in Japanese:
<a href="https://www.jst.go.jp/kisoken/presto/news/2016/161118/161118presto.pdf">JST Announcement</a>.
Here is a short summary of the project's direction and goals.</p>
<p>Attention is a finite resource, and we need to use it smartly.
We need new tools to manage our attention better,
to improve our collective intelligence.
There are patterns in human physiological signals
(facial expressions, heart rate, nose temperature,
eye movements, blinks, etc.) that can reveal information
about intentions and cognitive functions of individuals
and groups. So far, this data has mainly been exploited by
advertising and marketing companies. This project aims to
explore these patterns using an Open Eyewear Platform to
understand our behavior better.</p>
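A common first step toward finding such patterns is to summarize each physiological signal (blink events, heart rate, nose temperature, and so on) over fixed time windows before relating it to cognitive measures. The sketch below is a minimal, generic illustration; the function name, window length, and feature choice are my assumptions, not part of the project:

```python
import numpy as np

def window_features(signal, fs, win_s=60.0):
    """Split a 1-D physiological signal into non-overlapping windows
    and compute simple summary features per window: mean, standard
    deviation, and peak-to-peak range. Trailing samples that do not
    fill a whole window are discarded.
    """
    win = int(win_s * fs)
    n_windows = len(signal) // win
    feats = []
    for i in range(n_windows):
        w = np.asarray(signal[i * win:(i + 1) * win], dtype=float)
        feats.append((w.mean(), w.std(), np.ptp(w)))
    # Shape: (n_windows, 3) -- one feature row per window.
    return np.array(feats)
```

Feature rows like these can then be fed to a classifier or correlated with cognitive-state labels; richer features (blink rate, saccade counts, spectral power) would slot into the same per-window structure.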
<p><img src="/images/eyewear-overview.jpg" alt="overview" /></p>
<p>This interdisciplinary project focuses on the use of patterns in physiological signals to quantify
and improve our daily practices using a smart glasses design.
First, we assess the link between behavior
patterns/physiological signals and social/cognitive
functions using specialized medical hardware (ground truth)
and we quantify them in real life (from the lab to everyday life).
In a second step, we explore interactions to improve behavior:
learn smarter, work smarter, live smarter.</p>
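The first step described above, checking how well a glasses-derived feature tracks a ground-truth measure from specialized medical hardware, can be illustrated with a plain Pearson correlation. This is only a generic sketch of that kind of validation, not the project's actual analysis pipeline:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    e.g. a per-session feature from eyewear sensors and the matching
    ground-truth values from medical-grade equipment. Returns a value
    in [-1, 1]; values near +/-1 suggest the cheap sensor tracks the
    reference measure well.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm ** 2) * np.sum(ym ** 2)))
```

`np.corrcoef` computes the same quantity; the explicit form just makes the centering-and-normalizing steps visible. In practice one would also report a confidence interval rather than a bare coefficient.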
<p>Two publications that are the basis for the project and
a good summary of previous work:</p>
<hr />
<p><a href="/papers/bulling2016eyewear.pdf"><em>Eyewear computers for human-computer interaction</em></a>. Bulling, Andreas and Kunze, Kai. interactions 23, 3. 2016. <a href="/papers/bib/bulling2016eyewear.bib">Bibtex</a>. </p>
<hr />
<p><a href="/papers/amft2015making.pdf"><em>Making Regular Eyeglasses Smart</em></a>. Amft, Oliver and Wahl, Florian and Ishimaru, Shoya and Kunze, Kai. Pervasive Computing, IEEE. 2015. <a href="/papers/bib/amft2015making.bib">Bibtex</a>.</p>

Origins of Eyewear Computing 2016-11-01T00:00:00+00:00 https://fkeel.github.io/posts/2016/11/01/background

<p>Smart glasses and, in general, eyewear are a relatively novel device class with a lot of possibilities for unobtrusive human-computer interaction, sensing, and even human-computer integration.</p>
<p>We started working a while back in the team of Masahiko Inami Sensei, at the time at Keio Media Design, on collaborative
research with J!NS. This led to the first unobtrusive sensing glasses, J!NS MEME.</p>
<p>I cover a lot of the initial research in early talks at the Chaos Communication Congress. Here’s the presentation about the concept of Eyewear Computing.</p>
<p>Video on Youtube:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/aWO8aejiRnA" frameborder="0" allowfullscreen=""> </iframe>
<p>Slides on Speakerdeck:
<script async="" class="speakerdeck-embed" data-id="f75c69a07f4c01328d155ab31af9093f" data-ratio="1.77777777777778" src="//speakerdeck.com/assets/embed.js"> </script></p>
<p>In addition, we also organized a <a href="http://www.dagstuhl.de/16042">Dagstuhl Seminar</a> on the topic.
<a href="http://drops.dagstuhl.de/opus/volltexte/2016/5820/pdf/dagrep_v006_i001_p160_s16042.pdf">Report and Summary of the Seminar</a></p>