Traveling to CHI 2016, I finally got some time to write a bit more again. Sorry for the sporadic updates; I promise improvements.
As mentioned, our seminar on Eyewear Computing was quite different from some other seminars, as we focused more on hands-on tutorials and prototyping than on talks and academic discussions. The seminar included workshops and tutorials on eye tracking (optical and EOG), egocentric vision, optics, and head-mounted displays.
We “adopted” and “enhanced” some of the ideas regarding schedule and structure from the excellent Seminar on Life-long Health Behavior-change Technologies. The most important part, which we had only little influence on, is getting great participants. Here we were really lucky (as you might also recognize when the seminar documentation is published ;) ).
In the following, I’ll go over the characteristics that made the seminar special, discussing what worked and what didn’t while structuring and conducting the seminar.
#short talks We limited the introduction talks to 3 minutes per person and static slides. Everybody sent us a PDF before the seminar started. The slides were combined into one larger PDF, and we used automatic timing to step through them ;) This was the first seminar I attended with no delays during the introduction talks, as using PDF only eliminated problems with PowerPoint or other presentation software ;)
Suggestions from participants: change the timing slightly to give junior participants more time (e.g., 5 min junior, 3 min senior). Other ideas included an internal poster session for junior participants, so they can gain even more face-to-face time with senior researchers.
#workshops and tutorials As mentioned before, we organized a couple of hands-on tutorials. This meant an additional burden for the workshop/tutorial organizers, as they needed to bring hardware etc. However, it was very well received, and it also stimulated research discussions with quick hack sessions to try out ideas. This was the first Dagstuhl Seminar where we could start recording some initial datasets for exploring concrete research ideas. A special thanks to Moritz Kassner, Will Patera, Scott Grenwald, Shoya Ishimaru, Yuji Uema, Koichi Kise, Kiyoshi Kiyokawa (Osaka University), Thad Starner, James M. Rehg and Kristen Grauman for organizing the hands-on sessions.
#pre-workshop paper collection We asked participants to send us a list of their preferred reading, top papers related to the topic of the seminar, before the event. We made three separate requests:
- 1-5 papers they find extraordinary regarding the topic, not authored by them or their colleagues
- 1-5 papers authored by the participant that are relevant and important to eyewear computing
- any other reference (book, article, app, tool) related to the topic they want to share with other participants
This gave us a great reference list before the seminar and helped us structure the tutorials and workshops as well as other sessions.
#topics for breakout sessions The only direct criticism we received related to the breakout session topics. For some participants, the topic selection seemed too ad-hoc and random.
Participants suggested including topic finding in the preparation phase (similar to the paper collection). Including a “poster session” by junior researchers was also considered a better starting point for organizing breakout session topics.
#documentation We reserved a couple of hours on the second-to-last and last days for writing documentation and notes.
Assigning a note taker upfront for each of the sessions also helped distribute the responsibility and created a great backlog of information. I have already found myself going back to the notes several times while writing articles etc.
#final thoughts Thanks again to my amazing co-organizers Andreas Bulling, Ozan Cakmakci and Jim Rehg. It’s impressive to see how much time they made in their busy schedules and how professionally they worked on the seminar. I was able to learn a lot by observing them, and it was a lot of fun to work together.
The summary report of the seminar should be out soon.