Posters to EuroVR 2019

Last Call for Posters to EuroVR 2019:
16th EuroVR International Conference

23-25 October, 2019
TalTech Mektory, Tallinn, Estonia
https://eurovr2019.eu/
Deadlines:

  • Scientific poster paper submissions: July 3, 2019, 23:59 CEST
  • Notification to authors: July 19, 2019
  • Camera-ready versions of accepted contributions: July 29, 2019, 23:59 CEST

The scientific proceedings of EuroVR 2019, which will include the papers accompanying the scientific posters, will be published by Springer as a volume in the Lecture Notes in Computer Science (LNCS) series.

Submission guidelines can be found on the conference web page at: https://eurovr2019.eu/call-for-papers/submission-guidelines-scientific-track/

All scientific contributions must be uploaded to Springer's Online Conference Service (OCS). For scientific poster papers, please use: https://ocs.springer.com/ocs/home/EuroVR2019_Scientific_poster_track

To contact the posters co-chairs, please email eurovr_posters_2019 at cs.uni-bremen.de

To contact the IPC chairs, please use OCS or ipcchairs.eurovr2019@taltech.ee

Further information on EuroVR 2019 is available at: https://eurovr2019.eu/

3D Everywhere: 2019 Web3D Conference

3D Everywhere: 2019 Web3D Conference to Address Innovations in Medicine, Graphics, Access

This year’s 24th Annual Web3D Conference will bring together experts from the fields of medicine, imaging, computer graphics, and health informatics to discuss the critical importance of 3D data and visualization in healthcare: from patient and provider education, to diagnosis, therapy, collaborative care, and medical decision-making. 

In this era of great transformation, there is also great opportunity. This year's program includes a Medical Web3D Workshop and keynotes from two renowned researchers: Dr. Arno Hartholt of USC ICT will present "Virtual Humans for the Web and Beyond," and Dr. Ed Hammond of Duke and HL7 will present "Toto, I have a feeling we are not in Kansas anymore. The world has changed."

Attend Web3D 2019 to connect and contribute to the new ecosystem of Medical Web3D for education and consent, diagnosis, simulation, therapy, and 3D printing! With the requirements of a Lifetime of 3D, durability and interoperability of information are at a premium in healthcare. Discover how ISO/IEC international standards for 3D graphics on the Web can help you share your interactive 3D data between any source and delivery platform.

Web3D 2019 will be held July 26-28 at the Hotel Indigo, Downtown Los Angeles. The Web3D Conference is sponsored by ACM SIGGRAPH and organized in cooperation with the Web3D Consortium and Eurographics. Corporate sponsors are the Khronos Group and 3DMD.com.

https://web3d2019.web3d.org/
Please see the Web3D website for more details.

Conferences: SCA2019 Early Registration Ends Soon

The 2019 Symposium on Computer Animation will be held at UCLA on July 26-28, 2019 (i.e., immediately preceding SIGGRAPH).

Early-bird registration for SCA 2019 has been extended to June 28, 2019, which is fast approaching.
Register early and take advantage!

Registration Information:
https://sca2019.kaist.ac.kr/data/siggraph/websites/siggraph.org/public_html/registration-2/

The full program for the symposium has also recently been announced. In addition to a full complement of 18 regular papers and 5 invited journal papers, there will be a sketch session, a poster reception, a conference dinner, and a tremendous lineup of invited speakers:

Keynote Speakers:
- Uri Ascher (UBC)
- L. Mahadevan (Harvard)

Invited Speakers:
- Mridul Aanjaneya (Rutgers University)
- Chenfanfu Jiang (University of Pennsylvania)
- Sophie Jörg (Clemson University)
- Mélina Skouras (Inria Grenoble Rhône-Alpes)
- Steve Tonneau (University of Edinburgh)
- Etienne Vouga (UT Austin)

Conference Program:
https://sca2019.kaist.ac.kr/data/siggraph/websites/siggraph.org/public_html/program/

Additional details about the conference can be found on the website: https://sca2019.kaist.ac.kr

Hope to see you in Los Angeles!

ACM SIGGRAPH Frontiers Workshops Announced

Greetings,

We are excited to announce the topics for this year's ACM SIGGRAPH Frontiers Workshops:

Computer Graphics for Autonomous Vehicles
Content Generation for Workforce Training
Sim-to-Real: From Skilled Virtual Agents to Real-World Robots
Immersive Visualization
Cybersickness: Causes and Solutions
Textiles: Virtual to Actual

The ACM SIGGRAPH Frontiers Workshops showcase perspectives on emerging and adjacent areas of interest to the SIGGRAPH community. These workshops are full-day explorations into complex new problems, providing a deep dive for participants into areas like robotics, autonomous vehicles, textiles, assistive and adaptive technology, and manufacturing. All workshops will be held Sunday, 28 July, 9:00 AM - 5:00 PM. Plan to participate and book your flights accordingly.

For more information see https://s2019.siggraph.org/conference/programs-events/organization-events/frontiers-workshops/

See you at the conference!
The ACM SIGGRAPH New Communities Strategy Team

Note: There is no additional charge for registered SIGGRAPH 2019 attendees to attend any of the ACM SIGGRAPH Frontiers Workshops. Lunch will be made available to workshop participants who are interested in additional networking time with other participants (space is limited). A lunch ticket may be purchased for $45 through the online registration system (https://register.rcsreg.com/r2/siggraph2019/). Deadline to secure your lunch is Monday, 8 July.

Sci-Tech Oscar Honors Revolutionary Facial Capture System

written by Melanie A. Farmer

When Paul Debevec was a post-doc at UC Berkeley in the 1990s, achieving realistic digital human faces was considered a “holy grail” goal in the field of visual effects. This challenge became an area of computer graphics that Debevec and others would work in for the next two decades. Lucky for many moviegoers, they stuck it out: through their revolutionary facial capture methods, memorable characters in such high-grossing films as Avatar and Spider-Man 2 were brought to life on the big screen.

This year, the Academy of Motion Picture Arts and Sciences awarded Debevec, Tim Hawkins, Wan-Chun Ma, and Xueming Yu a Technical Achievement Award (a Sci-Tech Oscar) for the invention of the Polarized Spherical Gradient Illumination facial appearance capture method and the design and engineering of the Light Stage X capture system. The innovative system has been used in over 40 feature films, including Maleficent, Furious 7, Oblivion, and Blade Runner 2049. It will also contribute to five new movies being released this year, including Captain Marvel and Shazam.

Indeed, the team’s facial scanning technique coupled with the light stage production system has staked its claim as an industry standard in Hollywood.

“This technology has helped achieve some of the most realistic photoreal digital actors in movies to date,” says Debevec, senior staff engineer at Google VR. “It allows visual effects artists to build a digital character based on the face of the real actor that has all of the same facial shape, appearance, and expressive lines and wrinkles as the animated character. Without this level of detail, the digital actor may not look believable, which would take the audience out of the story.”

Debevec and his team first introduced the new facial capture technique in a paper at the 2007 Eurographics Symposium on Rendering. The team had been working on how to digitize the shape and appearance of a human face down to the level of the person’s skin pores and fine facial creases. Pre-dating the team’s technology, techniques to create digital faces required a lot of time and effort, with low-res results that did not produce realistic details. Debevec and colleagues developed a method to capture natural human faces digitally in a few seconds with their spherical light stage device by shooting a series of photos under special polarized lighting conditions.

“We first put polarizers on all the light stage lights in 2005 to see if we could cancel out the shine of the skin from every lighting direction at the same time,” notes Debevec. “Then, we figured out the gradient illumination patterns to compute surface orientations from the specular reflections the following year. By August 2006, we showed some initial results of the high-res facial scanning to friends at Weta Digital, and within a few weeks they were sending over Sam Worthington and Zoe Saldana, the stars of Avatar, to have their faces scanned with us.”
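To give a flavor of the gradient-illumination idea Debevec describes, here is a minimal NumPy sketch of the core ratio computation: under linear gradient lighting patterns along x, y, and z (plus a constant, full-on pattern), the per-pixel ratio of each gradient image to the full-on image encodes a component of the surface orientation. This is an illustrative simplification under idealized reflectance assumptions, not the team's production pipeline; the function name and interface are hypothetical.

```python
import numpy as np

def normals_from_gradient_illumination(i_x, i_y, i_z, i_c, eps=1e-8):
    """Estimate per-pixel surface normals from four images of a subject:
    three lit by linear gradient illumination patterns along the x, y,
    and z axes, and one lit by a constant (full-on) pattern i_c.
    Illustrative sketch only, assuming idealized reflectance."""
    # Each ratio i_g / i_c lies in [0, 1]; remap to [-1, 1] so it can be
    # read as one component of the surface normal.
    n = np.stack([
        2.0 * i_x / (i_c + eps) - 1.0,
        2.0 * i_y / (i_c + eps) - 1.0,
        2.0 * i_z / (i_c + eps) - 1.0,
    ], axis=-1)
    # Normalize each pixel's vector to unit length.
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.maximum(norm, eps)
```

For example, a pixel whose surface faces straight up the z axis would appear at half brightness in the x and y gradient images and full brightness in the z gradient image, recovering the normal (0, 0, 1). Separating diffuse and specular reflections with the polarizers mentioned above is what lets the real system apply this kind of computation to the sharp specular component alone.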

In addition to feature films, the team's light stage technology has been used to create digital characters in a variety of video games. At SIGGRAPH 2013's Real-Time Live, Debevec's team worked with Activision to demonstrate "Digital Ira", one of the first photoreal digital characters achieved with real-time rendering. Today, copies of the current USC ICT light stage system have been purchased and installed at both the Activision and Electronic Arts game studios in Los Angeles. And in the team's highest-profile scanning project, they were invited to build and operate a special light stage scanning system to digitize President Barack Obama in the State Dining Room of The White House in the summer of 2014.

The team’s scanning time is down to just a fraction of a second now.

Debevec, who has been an active ACM SIGGRAPH member for more than 20 years and the recipient of SIGGRAPH's first Significant New Researcher Award, attended his first SIGGRAPH in 1994 while a summer intern at Paul Allen's Interval Research Corporation. Debevec published his first SIGGRAPH paper in 1996 on the topic of modeling and rendering architecture from photographs, and went on to use these techniques to direct the Electronic Theater film The Campanile Movie at SIGGRAPH 1997, which helped inspire the "Bullet-Time" shots in The Matrix. "And, I haven't missed a SIGGRAPH since," he says.

Debevec and team were presented with their Academy Technical Achievement Award at the Feb. 9, 2019 Sci-Tech awards ceremony held at the Beverly Wilshire Hotel. This marked Debevec’s second Academy Award; in 2010, the Academy presented him and his colleagues with a Scientific and Engineering award for the first generation of the light stage capture system and image-based relighting process used on films such as Spider-Man 2 and the Curious Case of Benjamin Button. “Receiving a second Academy Award is just as exciting as one would hope. The Academy Sci-Tech committee puts an enormous amount of effort into researching and validating which technologies and people should receive awards,” notes Debevec, “and they set a very high bar, so it is very gratifying