SBIR-STTR Award

Plug and Play Characters for 3D Virtual Environments
Award last edited on: 12/28/2023

Sponsored Program
SBIR
Awarding Agency
NSF
Total Award Amount
$650,000
Award Phase
2
Solicitation Topic Code
IC
Principal Investigator
Okan Arikan

Company Information

Animeeple Inc

146 Montelena Court
Mountain View, CA 94040
   (650) 417-5020
   contact@animeeple.com
   www.animeeple.com
Location: Single
Congr. District: 18
County: Santa Clara

Phase I

Contract Number: 1014407
Start Date: 7/1/2010    Completed: 12/31/2010
Phase I year
2010
Phase I Amount
$150,000
This Small Business Innovation Research (SBIR) Phase I project will evaluate the feasibility of creating reusable, self-encapsulated animated characters for use in 3D virtual environments, such as those found in training simulations, video games, virtual worlds, and other 3D applications. Animated characters are a major component of many virtual environments, but they are difficult to develop. Characters move and interact with each other in complex ways, and must autonomously make rational decisions. Unnatural movement or behavior can destroy a virtual environment's believability.

This project aims to create a reusable character animation component that can be integrated into different virtual environments using a high-level interface to communicate with the host application. The innovation behind this high-level interface is a novel technique that unifies control demands, environmental constraints, environmental interactions, and inter-character interactions into a single framework. The technique uses a differential representation of animation to enforce constraints while retaining the small-scale details that are key to realism.

Virtual environments are expensive to build. Since nearly half of the cost of a virtual environment is spent on artwork and engineering, it makes sense to reuse as much as possible, and reuse could dramatically lower costs. In practice, though, very little is reusable. Current character animation middleware products (i.e., third-party character animation software components) are not truly application-independent. They are often intimately tied to the host application's logic so that they can support application-specific features. This project will evaluate the commercial feasibility of offering application-independent, reusable character animation middleware as part of a complete virtual character platform. This platform will offer end-consumers customizable 3D avatars that can be used in any application that supports the proposed interface.

If adopted, reusable character assets and application-independent middleware will reduce development time and cost, stimulating the creation of new applications for training, entertainment, communication, and education.
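To make the abstract's central idea concrete, the sketch below illustrates what an application-independent character interface of the kind described might look like. This is purely a hypothetical illustration: the class name, methods, and the one-dimensional movement model are all invented for this example and do not reflect Animeeple's actual API or its differential-representation technique. The point is only the shape of the contract: the host application issues high-level intents and environmental constraints, and the character component resolves them internally.

```python
class CharacterComponent:
    """Hypothetical application-independent character interface.

    The host application communicates only through high-level calls
    (set a goal, mark blocked regions, tick the simulation); it never
    touches the component's animation internals. Movement here is a
    toy one-dimensional model purely to show the interface shape.
    """

    def __init__(self, position=0):
        self.position = position
        self.goal = None
        self.blocked = set()  # cells the environment marks impassable

    # --- high-level interface exposed to any host application ---
    def set_goal(self, x):
        """Control demand: where the character should end up."""
        self.goal = x

    def set_blocked(self, cells):
        """Environmental constraint: positions the character may not enter."""
        self.blocked = set(cells)

    def update(self):
        """Advance one step, reconciling intent with constraints.

        Returns the action taken ('idle', 'walk', or 'wait') so the
        host can stay in sync without knowing how it was chosen.
        """
        if self.goal is None or self.position == self.goal:
            return "idle"
        step = 1 if self.goal > self.position else -1
        if self.position + step in self.blocked:
            return "wait"  # environmental constraint overrides intent
        self.position += step
        return "walk"


# Any host can drive the character the same way:
c = CharacterComponent()
c.set_goal(2)
print(c.update())       # 'walk' (moves to 1)
c.set_blocked({2})
print(c.update())       # 'wait' (goal cell is blocked)
```

Because the host only ever sees goals, constraints, and reported actions, the same component could in principle be dropped into any application that speaks this interface, which is the reuse argument the abstract makes.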

Phase II

Contract Number: 1127499
Start Date: 9/1/2011    Completed: 8/31/2013
Phase II year
2011
Phase II Amount
$500,000
This Small Business Innovation Research (SBIR) Phase II project will complete the development of reusable, self-encapsulated animated characters for use in 3D virtual environments. 3D characters are difficult to develop because of the inflexibility of current motion representations. Presently, animations are compiled into characters using a static data structure, which makes the addition of new animations an off-line, time-consuming process. Using extensive motion annotation, our technology allows applications to link together animations at run-time. The end-product objective is a network of 3D mobile applications that run on multitouch-enabled devices such as smartphones. The technology enables 1) transfer of characters within a growing network of applications (i.e., 'plug and play' characters), 2) user selection of animations to use in each application, and 3) character control through a novel multitouch-based interface. The intellectual merit involves creation of a character authoring and control interface, and analysis of alternative, flexible representations of character animation at the semantic level.

This project will have broader impact in three areas. First, multitouch is a new interactive paradigm that will become ubiquitous through the proliferation of smartphones and tablets; this project will investigate multitouch schemes for intuitive control of complex, articulated models such as 3D humanoid figures. Second, this project will advance our understanding of semantic categories for human motion. Such labels are important for motion synthesis and motion recognition. Third, this project will develop methods for building virtual environments incrementally. Virtual environments are used widely in entertainment, training simulations, virtual worlds, and other 3D applications, and the company will develop technology for adding new assets in a scalable manner.
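The abstract's contrast between statically compiled animations and run-time linking via motion annotation can be sketched as a small motion-graph search. Everything here is hypothetical: the annotation scheme (each clip labeled with the semantic pose it starts from and ends in) and the class names are invented for illustration, not Animeeple's actual representation. The key property shown is that new clips can be added at run time and immediately participate in planning, with no offline recompilation step.

```python
from collections import deque


class AnimationClip:
    """A motion clip tagged with semantic annotations: the pose it
    starts from and the pose it ends in (hypothetical scheme)."""

    def __init__(self, name, from_pose, to_pose):
        self.name = name
        self.from_pose = from_pose
        self.to_pose = to_pose


class MotionLibrary:
    """Clips indexed by starting pose. Clips are added at run time;
    there is no static, precompiled transition structure."""

    def __init__(self):
        self.by_start = {}

    def add(self, clip):
        self.by_start.setdefault(clip.from_pose, []).append(clip)

    def plan(self, start_pose, goal_pose):
        """Breadth-first search for the shortest chain of clips whose
        annotations link start_pose to goal_pose; None if no chain."""
        queue = deque([(start_pose, [])])
        seen = {start_pose}
        while queue:
            pose, path = queue.popleft()
            if pose == goal_pose:
                return [clip.name for clip in path]
            for clip in self.by_start.get(pose, []):
                if clip.to_pose not in seen:
                    seen.add(clip.to_pose)
                    queue.append((clip.to_pose, path + [clip]))
        return None


lib = MotionLibrary()
lib.add(AnimationClip("idle_to_walk", "stand", "walk"))
lib.add(AnimationClip("walk_to_sit", "walk", "sit"))
print(lib.plan("stand", "sit"))  # ['idle_to_walk', 'walk_to_sit']

# A clip added later is usable immediately -- the run-time
# flexibility the abstract contrasts with static compilation:
lib.add(AnimationClip("sit_to_lie", "sit", "lie"))
print(lib.plan("stand", "lie"))  # ['idle_to_walk', 'walk_to_sit', 'sit_to_lie']
```

Because compatibility is decided by matching semantic labels rather than by a baked-in data structure, a character carrying its annotated clips could, in principle, be dropped into any application in the network, which is the 'plug and play' behavior the abstract describes.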