SBIR-STTR Award

Emotionally Immersive Tele-Learning
Award last edited on: 8/12/2016

Sponsored Program
SBIR
Awarding Agency
NSF
Total Award Amount
$1,203,057
Award Phase
2
Solicitation Topic Code
-----

Principal Investigator
Ian Bennett

Company Information

The Spirituality Network Inc

211 Cleveland Court
Mill Valley, CA 94941
   (415) 568-1068
   N/A
   www.thespiritualitynetwork.com
Location: Single
Congr. District: 02
County: Marin

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2012
Phase I Amount
$155,000
This Small Business Innovation Research (SBIR) Phase I project aims to incorporate novel machine-vision functionality and innovative social networking capabilities into the technology of distance learning and online webinars. The primary objective is to make online training and virtual collaboration more engaging and compelling by replicating the non-verbal feedback that signals the rate and acceptance of information delivery in lectures, making the experience of distance learning more emotionally immersive.

This project contributes four significant innovations: 1) a machine-vision recognition system for head position, gaze direction, and facial expressions of interest or comprehension, whose outputs, averaged across participants, provide simple feedback on the rate and acceptance of information delivery; 2) a machine-vision hand-detection system that uses motion and shape to recognize hand raising and other gestures; 3) a framework for hot-deployable third-party pedagogical applications, or "side apps"; and 4) integrated social functionality that replicates pre- and post-lecture socialization, including pair sharing, breakout groups, team teaching, and support for teaching assistance. In anticipation of support for this project, TSN has already built an evaluation test bed for tele-lectures and virtual classrooms.

The broader impact/commercial potential of this project is to significantly transform online education, radically altering how education is delivered. By 2018, the cost of a four-year public university education is expected to rise to an estimated $151,000; for private colleges, the cost will exceed $300,000. To address this crisis, several colleges and startup companies have announced an increased use of online training. However, existing systems for streaming lecture video and for virtual group-learning environments have not advanced to the point where distance learning is no longer treated as a second-class citizen in the educational world. The proposed system can transform the fundamental efficacy of online training and spur new research.
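The abstract's first innovation, averaging per-participant visual cues into a single feedback signal for the lecturer, can be sketched as follows. The cue names, weights, and scores here are illustrative assumptions, not the project's actual classifier outputs.

```python
# Hypothetical sketch: combining per-student machine-vision cues into one
# engagement score each, then averaging across the class so the lecturer
# sees a single aggregate signal. All values and weights are assumptions.
from statistics import mean

# Illustrative per-student outputs of a vision pipeline, each in [0, 1].
students = [
    {"gaze_on_screen": 0.9, "interest": 0.7, "comprehension": 0.8},
    {"gaze_on_screen": 0.4, "interest": 0.3, "comprehension": 0.5},
    {"gaze_on_screen": 0.8, "interest": 0.6, "comprehension": 0.2},
]

# Assumed weights for fusing the cues into one score per student.
WEIGHTS = {"gaze_on_screen": 0.4, "interest": 0.3, "comprehension": 0.3}

def engagement(cues):
    """Weighted combination of visual cues into a single [0, 1] score."""
    return sum(WEIGHTS[k] * v for k, v in cues.items())

def classroom_feedback(all_cues):
    """Average individual scores into one classroom-level signal."""
    return mean(engagement(c) for c in all_cues)

print(round(classroom_feedback(students), 2))  # prints 0.59
```

In a real deployment the scores would arrive as a stream from each student's webcam and the average would be recomputed continuously, but the aggregation step itself is this simple.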

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2014
(last award dollars: 2016)
Phase II Amount
$1,048,057

This SBIR Phase II project aims to incorporate novel machine-vision and social networking functionality into technologies used in online education and webinars. Researchers have long identified engagement as the key ingredient for success in any learning environment, particularly the online environment, yet current online teaching systems give instructors no means of gauging their students' level of engagement because those students are not visible. Improving current online teaching modalities therefore requires ways to communicate to the instructor the engagement level of their unseen online students. A successful outcome of the project will allow the lecturer to receive real-time feedback from facial expressions, gaze, and other body kinesics which, when averaged across the virtual classroom, provides feedback on how the delivered information is being received. The project supports NSF's mission in education, which seeks to answer questions about how teachers can provide effective cognitive and motivational support for students. This project aims to promote richer interactions between tutor and online students through integrated cognitive and motivational scaffolding, leading to higher levels of student success and enabling students to compete more effectively as skilled artisans in the 21st-century workforce. 
This project contributes four significant innovations: 1) a machine-vision recognition system for gaze direction and facial expressions of engagement, which aggregates data across participants to improve signal-to-noise; 2) a machine-vision recognition system for detecting hand gestures and postural kinesics; 3) a framework, supported by ancillary hardware, for third-party pedagogical applications that can be sold through an educational application store; and 4) integrated social network functionality that replicates pre- and post-lecture socialization, including pair sharing, breakout groups, team teaching, and support for online teaching assistance.

The goals and scope of research required to support these innovations include: improving machine-vision classifier data and functionality; optimizing the user interface and the integrated user-calibration process; extending the system to both synchronous and asynchronous modalities; developing neuro-psychological and cognitive models of the online pedagogical process; designing and integrating content/media interoperability and social networking capabilities; creating open application programming interfaces for third-party developers, along with application-store functionality for digital whiteboards and tablets; and developing ancillary software modules that help students manage data related to their educational efforts, diagnose study and achievement patterns, and provide expert advice based on that data.
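The claim that aggregating data across participants improves signal-to-noise rests on a standard statistical fact: the spread of an averaged noisy estimate shrinks roughly as 1/sqrt(n). A minimal simulation, with entirely made-up engagement and noise values, illustrates the effect.

```python
# Illustrative only: per-student engagement readings are modeled as a true
# value plus Gaussian classifier noise; averaging over a class steadies the
# classroom-level estimate. All parameters here are assumptions.
import random
import statistics

random.seed(0)
TRUE_ENGAGEMENT = 0.7   # hypothetical ground-truth classroom engagement
NOISE_SD = 0.2          # assumed per-student classifier noise

def classroom_estimate(n_students):
    """Mean of n noisy per-student engagement readings."""
    readings = [TRUE_ENGAGEMENT + random.gauss(0, NOISE_SD)
                for _ in range(n_students)]
    return statistics.mean(readings)

def estimate_spread(n_students, trials=2000):
    """Empirical standard deviation of the classroom-level estimate."""
    return statistics.stdev(classroom_estimate(n_students)
                            for _ in range(trials))

spread_1 = estimate_spread(1)      # one student: noisy
spread_100 = estimate_spread(100)  # 100 students: much steadier
print(spread_1 / spread_100)       # roughly 10, i.e. sqrt(100)
```

This is why a per-student classifier that would be too noisy to trust on its own can still yield a usable real-time signal at classroom scale.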