AppliedVR
A new way of visualizing construction
As a founding member of the AppliedVR team at Google, I worked toward our goal of using VR to improve the design and construction processes for data centers and workspaces.
I led art and technical art direction for the team and built pipelines that brought CG-based workflows into automated cloud processes. With parallel computing in the cloud, we reduced the time and rework of creating real-time-ready assets from weeks to hours, at the click of a button. The scope of work ranged from fully automated to bespoke custom VR scenes, each with different workstreams and requirements. UX and interactions were built as common components, but new interactions were often made to best meet users' needs.
Scale and flexibility were key to building applicable VR simulations for our users. The following sections outline different product offerings and pipeline processes that our team built.
Art Production Pipeline
CAD sourced models from various tools.
We work with designers and content experts to export the correct data for our content creation tools:
USD
FBX
Maya
PiXYZ
Substance Tools
Zbrush
Adobe suite
Optimization through automation.
Meshes are created at the level of detail the project requires. These operations can be run locally or in the cloud:
tessellation
removing holes
deleting hidden geometry
defeaturing
decimating
etc.
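The operations above run as an ordered chain, whether locally or as a cloud batch job. A minimal sketch of that chaining, where the mesh structure and step implementations are placeholders rather than the production PiXYZ/Maya code:

```python
# Illustrative sketch of the automated optimization pass. A "mesh" is a
# plain dict here; the real pipeline drives tools such as PiXYZ.

def tessellate(mesh):
    """Convert CAD surfaces into triangle geometry (placeholder)."""
    return {**mesh, "tessellated": True}

def remove_holes(mesh):
    """Fill small gaps left over from CAD export (placeholder)."""
    return {**mesh, "holes": 0}

def delete_hidden(mesh):
    """Drop geometry that can never be seen, e.g. internal parts."""
    hidden = mesh.get("hidden_triangles", 0)
    return {**mesh, "triangles": mesh["triangles"] - hidden, "hidden_triangles": 0}

def defeature(mesh):
    """Remove small features (bolts, fillets) below a size threshold (placeholder)."""
    return {**mesh, "defeatured": True}

def decimate(mesh, ratio=0.25):
    """Reduce triangle count toward a real-time budget."""
    return {**mesh, "triangles": int(mesh["triangles"] * ratio)}

# Steps run in a fixed order; the same chain works locally or in a container.
PIPELINE = [tessellate, remove_holes, delete_hidden, defeature, decimate]

def optimize(mesh):
    for step in PIPELINE:
        mesh = step(mesh)
    return mesh

source = {"triangles": 4_000_000, "holes": 12, "hidden_triangles": 1_000_000}
result = optimize(source)
print(result["triangles"])  # 750000
```

Keeping each operation as an independent step is what lets the same chain run locally for a single asset or fan out across cloud workers for a full model.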
Fix outliers.
Sometimes parts of a model need to be fixed by hand because the source data is insufficient. This is usually done in:
Maya
PiXYZ
Zbrush
Unreal Datasmith
Assemble.
Combine all the details from models to materials in Maya.
Materials are usually created in Substance Designer and Painter using a non-destructive workflow.
Create different LODs as needed.
Deploy To Render Engine.
Roughly 80% of the work is done in the Maya scene; materials are then reconfigured for each engine.
Lighting, tools, and game interactions are added, then we build for target platform.
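Reconfiguring materials per engine mostly means remapping the same texture data onto each engine's expected slot names. A sketch of that idea, where the slot names are typical PBR conventions rather than the team's actual mapping tables:

```python
# Illustrative sketch: remap one authored material onto per-engine
# texture slots. The slot names below are examples, not a definitive
# mapping for either engine.

SOURCE_MATERIAL = {
    "base_color": "concrete_basecolor.png",
    "roughness": "concrete_roughness.png",
    "normal": "concrete_normal.png",
}

# Each engine expects different names for the same channels.
ENGINE_SLOTS = {
    "unreal": {"base_color": "BaseColor", "roughness": "Roughness", "normal": "Normal"},
    "unity": {"base_color": "_BaseMap", "roughness": "_RoughnessMap", "normal": "_BumpMap"},
}

def reconfigure(material, engine):
    """Return the material keyed by the target engine's slot names."""
    slots = ENGINE_SLOTS[engine]
    return {slots[channel]: path for channel, path in material.items()}

unity_material = reconfigure(SOURCE_MATERIAL, "unity")
print(unity_material["_BaseMap"])  # concrete_basecolor.png
```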
Light Field Technology
Seurat was used to create our light fields. This process uses images and depth data to generate proxy meshes (cards) from a 1 m³ headbox area, then bakes the images as textures onto the proxy faces, giving 6DoF parallax within the headbox. This allows the light fields to be viewed on multiple headset platforms, even mobile devices.
Visual fidelity is great, but movement is limited to the headbox area.
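Because the bake is only valid inside the 1 m³ headbox, a runtime client can keep the rendered eye position within that volume. A hypothetical sketch of that clamp (the headbox math here is my illustration, not Seurat's API):

```python
# Hypothetical sketch: clamp the tracked head position to a 1 m cube
# so the baked light field never reveals its edges.

HALF = 0.5  # the headbox extends ±0.5 m from its center on each axis

def clamp_to_headbox(pos, center=(0.0, 0.0, 0.0)):
    """Return pos limited to the headbox around center, per axis."""
    return tuple(min(max(p, c - HALF), c + HALF) for p, c in zip(pos, center))

print(clamp_to_headbox((0.2, 0.9, -1.3)))  # (0.2, 0.5, -0.5)
```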
Cloud Processing Pipeline
Some goals of the cloud pipeline:
Break up large models into smaller chunks
Run operations in a container environment
Distribute work across multiple processes
Use Google Cloud Platform to scale reliably and securely
I configured all 3D-production-related components and their software configurations to work modularly in a cloud-distributed pipeline that enabled automation for customers.
This included configuring software environments to work with the automated tasks I created in the digital content creation tools used in the pipeline.
I also created the license servers and deployments for all third-party software, with the corresponding network configurations to conform to security guidelines.
This also included creating GPU virtual workstations and configuring each VM environment to verify that workflows would work on the cloud hardware before pushing to production.
Finally, I created a deployment strategy via Kubernetes using Cloud Orchestrate and shared storage solutions, and helped improve the tools built to scale workloads. These workstations are now used across different projects at Google and taught me a lot about what's possible with cloud technologies.
Cloud Processing Pipeline Technologies
USD
Scalable, non-destructive interchange format between stages
Unity/Unreal Engine
Runtime clients
FBX SDK
Interchange format used by legacy CAD software and game engines
dsub
Open-source command-line tool to run batch computing tasks and workflows in a cloud environment
Pixyz
Tessellation and decimation solver
Google Cloud Platform
Google Cloud Storage (GCS) and Google Compute Engine (GCE) to drive our cloud processes
Maya
Cleaning up, splitting, and combining mesh data
Docker
Run operations in container environments on GCP
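Tying the technologies together, each chunk can be submitted as a dsub task that runs a container against GCS inputs and outputs. The flags below are standard dsub options, but the project, bucket paths, image, and script are all hypothetical:

```python
# Hypothetical sketch of submitting one chunk as a dsub task.
# Project, bucket, image, and optimize.py are illustrative names.

def dsub_args(chunk_id):
    """Build the dsub command line for one optimization chunk."""
    return [
        "dsub",
        "--provider", "google-cls-v2",
        "--project", "my-gcp-project",                       # hypothetical project
        "--regions", "us-central1",
        "--logging", "gs://my-bucket/logs/",                 # hypothetical bucket
        "--image", "gcr.io/my-gcp-project/mesh-optimizer",   # hypothetical image
        "--input", f"INPUT=gs://my-bucket/chunks/{chunk_id}.usd",
        "--output", f"OUTPUT=gs://my-bucket/optimized/{chunk_id}.usd",
        "--command", 'python optimize.py "${INPUT}" "${OUTPUT}"',
    ]

args = dsub_args("chunk_0001")
print(" ".join(args))
```

Because dsub localizes `--input` files into the container and delocalizes `--output` files back to GCS, the container script itself stays oblivious to the cloud plumbing.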