
Demos

Demonstrations give researchers and practitioners an interactive opportunity to present their systems, artifacts and/or research prototypes, either in a regular session or at the technical exhibition. In either case, a commercial format must be avoided, even if the demo presents a business product or service; the presentation should focus on technical aspects.
Any written supporting materials may be distributed locally but are not published in the proceedings. Authors who are already presenting a paper at the conference may apply for a demonstration to complement, but not replace, their paper presentation. Demonstrations can also be given by sponsor companies or as a joint initiative involving researchers and industrial partners.
Demonstrations take place in an informal setting that encourages presenters and participants to discuss the presented work. They are an opportunity to disseminate practical research results and to network with other applied researchers and business partners.



Concerning the format, a demo can be accommodated either in a booth (a physical area of 4 square meters with a table and 2 chairs) in the exhibition area, as a poster, or as a 20-minute oral presentation in a session set up especially for demonstrations. The same demo can also be presented in more than one format. Please contact the event secretariat.

DEMOS LIST



Visual Large-Scale Industrial Interaction Processing


Lecturer

Gernot Stübl
Profactor GmbH
Austria
 
Brief Bio
DI Dr. Gernot Stübl is a Senior Scientist and Project Leader in the Visual Computing team at Profactor, with a strong background in industrial image processing and machine learning. He received his Diploma in Computer Science in 2008 and his doctoral degree in 2014, both with distinction. He is a lecturer in Robotic Vision at the University of Applied Sciences of Upper Austria and a board member of the Austrian Association for Pattern Recognition. His research interests cover human-machine interaction, with a special focus on applied deep learning technologies for perception.
Abstract
This work investigates the coordination of human-machine interactions from a bird's-eye view, using a single panoramic color camera. The approach replaces conventional physical hardware sensors, such as light barriers and switches, with location-aware virtual regions. Recent pose-estimation methods are applied to detect human and robot joint configurations. By fusing 2D human pose information with prior scene knowledge, the system lifts these perceptions into a 3D metric space and can thus initiate environmental reactions induced by geometric events between humans, robots and virtual regions. The diverse application possibilities and the robustness of the approach are demonstrated live in three use cases.
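
The abstract above outlines a pipeline in which 2D pose detections are lifted into a metric 3D workspace and compared against virtual regions that replace physical sensors. The following Python sketch (not part of the demo material) only illustrates that idea: the homography matrix, region bounds and 2D keypoints are hypothetical placeholders standing in for the calibrated camera model and pose estimator described in the abstract.

# Minimal illustrative sketch (not the authors' implementation): a 2D image
# keypoint is lifted onto the metric floor plane via an assumed homography,
# and a reaction fires when the lifted point enters a virtual region.
import numpy as np

# Hypothetical image-to-floor homography; in practice this would come from
# camera calibration and prior scene knowledge, as described in the abstract.
H_IMG_TO_FLOOR = np.array([
    [0.01, 0.00, -3.2],
    [0.00, 0.01, -2.4],
    [0.00, 0.00,  1.0],
])

def lift_to_floor(px, py):
    """Project a 2D keypoint (e.g. an ankle joint) to floor coordinates in meters."""
    p = H_IMG_TO_FLOOR @ np.array([px, py, 1.0])
    return p[:2] / p[2]

class VirtualRegion:
    """Axis-aligned rectangle on the floor standing in for a light barrier or switch."""
    def __init__(self, name, x_min, x_max, y_min, y_max):
        self.name = name
        self.bounds = (x_min, x_max, y_min, y_max)
        self.occupied = False

    def update(self, point_xy):
        """Return 'enter' or 'leave' when occupancy changes, otherwise None."""
        x_min, x_max, y_min, y_max = self.bounds
        inside = x_min <= point_xy[0] <= x_max and y_min <= point_xy[1] <= y_max
        if inside and not self.occupied:
            self.occupied = True
            return "enter"
        if not inside and self.occupied:
            self.occupied = False
            return "leave"
        return None

# Dummy 2D detections; a real system would take them from a pose estimator.
slow_zone = VirtualRegion("robot_slow_zone", 0.0, 1.5, 0.0, 1.5)
for px, py in [(320, 240), (410, 300), (600, 450)]:
    event = slow_zone.update(lift_to_floor(px, py))
    if event == "enter":
        print(slow_zone.name, ": human entered -> request reduced robot speed")
    elif event == "leave":
        print(slow_zone.name, ": human left -> restore normal speed")

A full system would of course track multiple joints, humans and robots, handle occlusions, and perform the 3D lifting described in the abstract rather than relying on a single floor-plane homography.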

Secretariat Contacts
e-mail: chira.secretariat@insticc.org
