Please use Chrome or Firefox; Internet Explorer may have formatting issues.
Next AR was the project team I joined during my time at Electronic Arts. I worked with two designers, one producer, and one other engineer to create a pass-through AR experience using Unreal Engine 4. This is a group picture of my team; I am the rightmost person.
Next AR is an R&D project sponsored jointly by Carnegie Mellon University and Electronic Arts in Redwood City, California. Its goal is to create a pass-through AR experience built around social events: players in different physical locations use hand gestures to interact with other players and with virtual objects over the Internet. Below is the trailer video of the project.
Here is a demo video of some of the technology I implemented. I experimented with the hardware's limits and explored Unreal Engine's networking features.
Director of ETC's Silicon Valley campus and co-founder of Pacific Data Images.
A game producer and designer who shipped The Sims franchise and its expansion packs at EA.
Provides stable tracking.
The ZED depth camera mounted on the Vive.
The Leap Motion hand-gesture detector mounted on the Vive.
To incorporate both the ZED camera and the Leap Motion on the Vive, I compiled a customized build of Unreal 4.21 that allowed the ZED camera to hook into the engine's rendering pipeline. During this phase, I spent a lot of time testing the hardware and its APIs. To the left is a short video of me presenting the hardware setup to EA staff.
After the hardware was working, I created a simple interactive single-player demo in which players use their hands to grab and throw cubes. I also implemented a finger-drawing feature, which is popular in VR and AR. To the right is a short video of the single-player demo.
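The grab-and-throw interaction can be sketched outside the engine roughly as follows. This is a minimal, illustrative version, not the project's actual code: the `Vec3` type stands in for Unreal's `FVector`, the pinch threshold is an example value rather than a tuned one, and all names are hypothetical.

```cpp
#include <cassert>
#include <cmath>

// Minimal vector type standing in for Unreal's FVector.
struct Vec3 { float x, y, z; };

static float Dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A pinch is detected when the thumb tip and index tip are close together.
// The 2.5 cm threshold is illustrative, not tuned.
bool IsPinching(const Vec3& thumbTip, const Vec3& indexTip,
                float thresholdCm = 2.5f) {
    return Dist(thumbTip, indexTip) < thresholdCm;
}

// While pinched, the cube follows the hand; on release it keeps the hand's
// last velocity, so it flies off as a throw.
struct GrabbableCube {
    Vec3 position{0, 0, 0};
    Vec3 velocity{0, 0, 0};
    bool held = false;

    void Tick(bool pinching, const Vec3& handPos, const Vec3& handVel) {
        if (pinching) {
            held = true;
            position = handPos;  // follow the hand while held
            velocity = handVel;  // remember the hand's motion
        } else if (held) {
            held = false;        // released: kept velocity becomes the throw
        }
    }
};
```

In the real demo the release velocity would feed into the engine's physics simulation; here it is simply retained on the object.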
Since social elements were one of the client's requirements, our group's next step was to add networking. I first tried Unreal 4's default client-server replication model, and it worked; however, it was laggy at the time. You can find the demo video to the left.
Fixing the laggy networking became my next priority. Instead of relying solely on the default server-authoritative model, I moved some gameplay logic to run on the client and had the client notify the server of the changes. An individual client no longer had to wait for a server round-trip, because the logic was computed locally; the server then broadcast the changes to all other clients.
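That flow can be sketched outside the engine roughly as follows. In Unreal terms it maps to a client-predicted change plus a server call that replicates to the other clients; everything below (class names, a single float of state per object) is an illustrative simplification, not the project's actual code.

```cpp
#include <cassert>
#include <map>
#include <vector>

// One replicated value per object, e.g. a cube's height.
using ObjectId = int;
using State = float;

struct Client {
    int id;
    std::map<ObjectId, State> world;

    // The client applies the change locally first (prediction), so it never
    // waits for a server round-trip, then reports the change to the server.
    State PredictMove(ObjectId obj, State newState) {
        world[obj] = newState;
        return newState;
    }

    // Called by the server when some *other* client changed an object.
    void OnServerBroadcast(ObjectId obj, State s) { world[obj] = s; }
};

struct Server {
    std::map<ObjectId, State> world;
    std::vector<Client*> clients;

    // The server records the authoritative state, then rebroadcasts it to
    // everyone except the sender, who has already applied it locally.
    void ReceiveFromClient(int senderId, ObjectId obj, State s) {
        world[obj] = s;
        for (Client* c : clients)
            if (c->id != senderId) c->OnServerBroadcast(obj, s);
    }
};
```

The key point is the ordering: the sending client updates itself immediately and only the remaining clients wait on the server, which hides the round-trip latency from the player performing the action.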
As the halfway milestone approached, we wanted to add realistic physics to the experience to foster social activities between two people. Instead of using fixed constraints for grabbing objects, I integrated a free physics plugin built for motion controllers and adapted it to the Leap Motion. The plugin's website is here: https://vreue4.com
To the left is the brand-new physics demo video.
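Conceptually, a physics-driven grab replaces the fixed attachment with a force that pulls the held object toward the hand, so collisions and momentum still apply while it is held. Below is a minimal one-dimensional spring-damper sketch of that idea; the constants and integrator are illustrative assumptions, not the plugin's actual implementation.

```cpp
#include <cassert>
#include <cmath>

// One-dimensional spring-damper pulling a held object toward the hand.
// Stiffness and damping values are illustrative, not tuned.
struct PhysicsGrab {
    float position = 0.0f;    // object position
    float velocity = 0.0f;    // object velocity
    float stiffness = 50.0f;  // spring constant k
    float damping = 10.0f;    // damping coefficient c

    // Semi-implicit Euler step toward the hand target (unit mass assumed).
    void Step(float handTarget, float dt) {
        float force = stiffness * (handTarget - position) - damping * velocity;
        velocity += force * dt;
        position += velocity * dt;
    }
};
```

With a fixed constraint the object teleports with the hand and can tunnel through other geometry; with the spring-damper it settles toward the hand over a few frames while remaining a normal physics body.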
At the halfway milestone, people from Electronic Arts headquarters play-tested the project demo. They provided valuable feedback and were surprised to see the networking feature already implemented. To the right is footage from the play-testing.
Up to this point, I had been building the technology stack. The next step was to figure out how to use the AR features and that stack to create a social AR experience. A challenge lay ahead.
With the technology stack in hand, our team decided to create a virtual BBQ party where players in different physical locations can drink wine, build a campfire, and grill food together over the Internet. Our clients liked it a lot, because the project lets people socialize with others, especially during the COVID quarantine.
Copyright © 2023 Zhisheng Xu - All Rights Reserved.
Some videos use royalty-free music from Bensound.