Hologram Stability on HoloLens 2

Garrett Isaacs 21 Reputation points
2020-10-01T20:30:10.98+00:00

Hey all,

I'm working on the HoloLens 2 and I'm trying to get remote rendering set up in preparation for potentially needing to render high-polygon models. The model I'm trying to render now has ~570k polygons. When I render it into the Unity scene, it looks fine while I'm standing still, but as I walk around it, it seems to shake and tear, akin to holograms in old sci-fi movies. I've checked my network settings and they fall within the recommended parameters, and the model does not shake when prebuilt into the scene on the device. Given the circumstances, depth LSR seems to be the way to go, and I'm working with the free tier of the remote rendering service. Any advice on how to clean up this model during use?

Azure Remote Rendering
An Azure service that renders high-quality, interactive three-dimensional content and streams it to edge devices in real time.

2 answers

  1. Christopher Manthei 251 Reputation points Microsoft Employee
    2020-10-07T16:41:17.043+00:00

    Hi @Garrett Isaacs ,

    If I understand you correctly, you want to use planar LSR in combination with ARR?
    The following page describes the different options and requirements that you need to meet for both LSR techniques: https://video2.skills-academy.com/en-us/azure/remote-rendering/overview/features/late-stage-reprojection

    As mentioned on that page, you will need to retrieve the remote focus point yourself and set it in your app. This would be the bare minimum of what you would need to do:

    using Microsoft.Azure.RemoteRendering;
    using Microsoft.Azure.RemoteRendering.Unity;
    using System;
    using UnityEngine;

    public class UpdateFocusPoint : MonoBehaviour
    {
        // Update is called once per frame
        void Update()
        {
            FocusPointResult validResult = FocusPointResult.Invalid;
            Float3 position = new Float3();
            Float3 normal = new Float3();
            Float3 velocity = new Float3();

            // Obtain the focus point using the same coordinate system that is used as the user coordinate system.
            IntPtr ptr = UnityEngine.XR.WSA.WorldManager.GetNativeISpatialCoordinateSystemPtr();
            if (ptr != IntPtr.Zero)
            {
                validResult = RemoteManagerUnity.CurrentSession.GraphicsBinding.GetRemoteFocusPoint(ptr, out position, out normal, out velocity);
            }

            // Use the remote focus point whenever the data is valid, i.e., a proper focus point or a fallback one.
            // In case it is invalid, we do not set a focus point and instead rely on Mirage to select a sensible fallback.
            if (validResult != FocusPointResult.Invalid)
            {
                UnityEngine.XR.WSA.HolographicSettings.SetFocusPointForFrame(CommonExtensions.toUnity(position), CommonExtensions.toUnity(normal), CommonExtensions.toUnity(velocity));
            }
        }
    }
    

    For a production solution you would want to implement additional things though:

    1. Compute a focus point for local content as well (for example, via a raycast in Unity) and use whichever of the local and remote focus points is closer to the camera.
    2. The remote focus point is only updated about once per second, so it is recommended to interpolate the focus point over time rather than always snapping to the newest value, taking discontinuities and app-specific behavior into account.
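    To illustrate the two points above, here is a rough sketch only, not a production implementation: the class name, the `lerpSpeed` field, and the discontinuity threshold are all hypothetical, and the remote focus point is passed in as a parameter (in a real app it would come from `GetRemoteFocusPoint` as in the snippet above).

    ```csharp
    using UnityEngine;

    public class SmoothedFocusPoint : MonoBehaviour
    {
        // Hypothetical tuning values; adjust per application.
        public float lerpSpeed = 4.0f;                 // interpolation speed
        public float discontinuityThreshold = 1.0f;    // meters; snap instead of lerp beyond this

        private Vector3 currentFocusPoint = Vector3.forward;

        public Vector3 ComputeFocusPoint(Vector3 remoteFocusPoint, Camera cam)
        {
            Vector3 target = remoteFocusPoint;

            // 1. Raycast against local content along the user's gaze and take
            //    whichever focus point is closer to the camera.
            if (Physics.Raycast(cam.transform.position, cam.transform.forward, out RaycastHit hit))
            {
                float localDist = hit.distance;
                float remoteDist = Vector3.Distance(cam.transform.position, remoteFocusPoint);
                if (localDist < remoteDist)
                {
                    target = hit.point;
                }
            }

            // 2. The remote point only refreshes about once per second, so
            //    interpolate toward it instead of snapping every frame.
            //    On a large discontinuity, jump immediately.
            if (Vector3.Distance(currentFocusPoint, target) > discontinuityThreshold)
            {
                currentFocusPoint = target;
            }
            else
            {
                currentFocusPoint = Vector3.Lerp(currentFocusPoint, target, Time.deltaTime * lerpSpeed);
            }
            return currentFocusPoint;
        }
    }
    ```

    The result of `ComputeFocusPoint` could then be fed to `SetFocusPointForFrame` in place of the raw remote value.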

    However, these are quite application dependent so we can't provide you with a general code sample that would work in every situation.

    Cheers,
    Christopher

    1 person found this answer helpful.

  2. Christopher Manthei 251 Reputation points Microsoft Employee
    2020-10-08T16:05:13.93+00:00

    @Garrett Isaacs , in case your question does not refer to using planar LSR, can you answer the following questions?

    1. Is this your own application or one of our samples?
    2. Do you have the same issue when you load our sample engine model in our Unity Showcase or Quickstart apps?
    3. Does your model contain transparencies? Transparent objects do not write to the depth buffer so they can't be reprojected.
    4. Can you post a screenshot of the statistics panel in the Showcase or Quickstart app so we can rule out general network issues?
    5. If hologram stability is generally poor, even in our sample app, can you create a trace and send it to us?

    Thanks,
    Christopher