3D vs AR Visual Differences

Why Does My Product Look Different in AR Compared to a 3D Visual Configurator?


3D visualization is a powerful tool for brands looking to elevate their digital efforts. Bringing products to life online with high-fidelity digital twins creates a compelling experience for buyers and demonstrates the exact form, function, and quality they can expect after purchase.


Augmented reality (AR) is being paired with 3D visualizers and configurators with increasing frequency, extending the immersive buying experience and further amplifying the associated business results. However, as product managers dive deeper into these experiences, they often notice that what they see in a configurator does not perfectly match what comes to life through AR. While the technology is quickly evolving on both sides, there are several factors to take into consideration when comparing a 3D product experience to one in AR.


Will my 3D product look the same in Augmented Reality? What are the differences between 3D and AR?


Rendering Techniques & Mobile Software
An AR view will differ visually from its 3D counterpart depending on the techniques used in model creation, the rendering technologies involved, and the way each handles lighting. You can also anticipate differences between iOS and Android. AR technologies, while currently limited in the consumer market, are improving, and our team has been actively exploring methods to bring more visual consistency to the AR experience through cutting-edge AR implementation.

AR software is also limited by the mobile device and the software running on it. These limitations include the web browser and the degree of mobile software standardization, which can affect visual parity in lighting, tone, and color between our web experience and the AR experience. This includes the step where you launch AR, which is powered by Google's Model Viewer and Apple's iOS AR viewer (AR Quick Look), not Dopple.
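As a rough illustration of that handoff, the sketch below uses Google's <model-viewer> web component, which launches the native AR viewer on each platform. The asset paths and setup are placeholders for illustration, not Dopple's actual integration.

```ts
// Hypothetical setup: a <model-viewer> element that hands the product off to the
// platform's native AR viewer. Asset URLs are placeholders.
import '@google/model-viewer'; // registers the <model-viewer> custom element

const viewer = document.createElement('model-viewer');
viewer.setAttribute('src', '/assets/product.glb');      // glTF used in the browser and in Android Scene Viewer
viewer.setAttribute('ios-src', '/assets/product.usdz'); // USDZ handed to Apple's AR Quick Look
viewer.setAttribute('ar', '');                          // show the "View in AR" entry point
viewer.setAttribute('ar-modes', 'webxr scene-viewer quick-look'); // preferred launch order
viewer.setAttribute('camera-controls', '');

document.body.appendChild(viewer);
```

Once AR launches, rendering is handled by Scene Viewer or Quick Look on the device itself, which is why the visuals from that point forward sit outside the web experience's control.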

Why Does My Product Look Different in AR?

Lighting

Unlike a web-page-based experience or virtual reality, where an object is placed into a controlled environment, AR is the integration of digital information with the user's current environment. While conditions such as lighting and tone can be built into a desktop or mobile experience, those conditions cannot be passed on to AR. Augmented reality uses real-time lighting from the user's physical environment: a bright, sunny room will affect how a product is viewed in AR, as will a dimly lit space or lighting from various angles. Dopple's 3D desktop web experience uses a controlled global lighting setup that does not carry over once the product passes through to AR.
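To make that split concrete, the hypothetical snippet below shows the browser side of the equation, where lighting is simply an environment map the page chooses; in AR, the native viewer replaces it with its own estimate of the room. The HDR path and attribute values are illustrative.

```ts
// Browser view: lighting is a deliberate, controlled choice.
const viewer = document.querySelector('model-viewer');

if (viewer) {
  viewer.setAttribute('environment-image', '/assets/studio.hdr'); // placeholder HDR "studio"
  viewer.setAttribute('exposure', '1');                           // tuned once, looks the same for every visitor
}

// AR view: none of the above applies. Scene Viewer (Android) and AR Quick Look (iOS)
// estimate lighting from the device camera, so a sunny window or a dim hallway changes
// how the same materials read.
```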


Rendering Capabilities

Native WebAR capabilities are now common among leading smart devices; however, because they are built by different developers, the way each device processes the information that makes up an AR experience may vary. Variations in texture compression and lighting approximation produce end results that the experience builder cannot fully control.

As things stand today, core limitations in WebAR cap the degree of fidelity that can be achieved, largely because some material extensions do not transfer to an AR environment. Fortunately, the 3D development community is actively building new capabilities and closing the gap between browser-based and AR experiences.
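As a hedged example of what "does not transfer" can mean in practice, the sketch below inspects a glTF asset's declared material extensions and flags a few that native AR viewers have historically handled inconsistently. The extension list and asset URL are illustrative assumptions, not a compatibility table.

```ts
// Hypothetical pre-flight check: list glTF material extensions in an asset that the
// native AR viewers may render differently (or ignore). Support varies by OS and
// viewer version, so treat these names as examples only.
const EXTENSIONS_TO_REVIEW = [
  'KHR_materials_transmission', // glass-like transmission
  'KHR_materials_clearcoat',    // automotive-style clearcoat
  'KHR_materials_sheen',        // fabric sheen
];

async function reviewExtensions(gltfUrl: string): Promise<string[]> {
  const response = await fetch(gltfUrl);
  const gltf = await response.json();               // JSON .gltf (a binary .glb would need parsing first)
  const used: string[] = gltf.extensionsUsed ?? [];
  return used.filter((ext) => EXTENSIONS_TO_REVIEW.includes(ext));
}

// Placeholder asset URL for illustration.
reviewExtensions('/assets/product.gltf').then((flagged) => {
  if (flagged.length > 0) {
    console.warn('These material extensions may look different in AR:', flagged);
  }
});
```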


Augmented Reality: Native vs. Web

The AR experience on a website also differs from the AR experience in a native app. We deliver experiences according to web standards to keep AR low-friction, while watching native developments as an indication of what may become available in the future.

Mobile devices running native platforms such as Google's Android or Apple's iOS have access to core functions and features as provided by a given hardware and software specification. These features and specifications are defined by the manufacturer and developer of each platform. Native apps distributed through each platform's app store have greater control over, and access to, these features.

The World Wide Web, however, is a universal platform whose standards are largely defined and agreed upon by a broad consortium of companies and individuals. This standardization drives consistency across general web experiences, as well as the software development that supports them. As these standards and protocols evolve, so does the software built on them. Yet that same reliance on universal support limits how far development can go beyond the confines of a local experiment. In many cases there are competing technologies vying to become the accepted standard, leaving the market divided. Remember VHS vs. Betamax? Or Blu-ray vs. HD DVD?

Closed platforms such as Android and iOS provide robust experiences and larger native feature sets because they are not limited to a universal mode of operation; they maintain their own standards within their own ecosystems. This has brought significant improvements to augmented reality, with many modern native apps taking advantage of better features for tracking faces, body parts, and objects.

The world has been enjoying these features in popular social media apps, as well as when virtually shopping for that next pair of eyeglasses. The downside is the required app download: AR experiences that live behind a download reach fewer users because of that friction.

The AR experience on the web, however, is limited to what current web browsers and web standards have actually shipped. WebAR can use only a limited set of each device's AR functions, and access to lower-level features such as face tracking becomes available only when web browsers support it. That support will come as the standards for the technology mature and the software catches up.

WebXR is the next step toward providing a deeper immersive experience, and it continues to be developed today.
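Where WebXR support has shipped, it can be detected before the experience is offered. The sketch below is a minimal, hypothetical feature check using the standard WebXR Device API.

```ts
// Minimal feature check for WebXR immersive AR. WebXR types are not yet in every
// TypeScript lib, so a narrow local type keeps this self-contained.
type XRSystemLike = { isSessionSupported(mode: string): Promise<boolean> };

async function supportsImmersiveAR(): Promise<boolean> {
  const xr = (navigator as Navigator & { xr?: XRSystemLike }).xr;
  if (!xr) return false; // this browser has not shipped WebXR at all
  try {
    return await xr.isSessionSupported('immersive-ar');
  } catch {
    return false; // the session mode is unknown to this browser
  }
}

supportsImmersiveAR().then((supported) => {
  console.log(supported ? 'WebXR AR is available' : 'Fall back to Scene Viewer / Quick Look');
});
```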