I have an application running on a virtual machine whose OS is Windows Server 2019 and which has its own IP address. The application being installed on the virtual machine requires the use of WebSockets. However, during the install process the application skips the step that would normally let me enter the WebSocket information: the IP address and whether HTTP or HTTPS will be used. The virtual machine appears to be correctly configured for WebSockets, and the same application installs and runs perfectly on the main production platform. The only difference I can find is that the small test system we built runs on a virtual machine, and on investigating I found that the host machine is running Windows Server 2008, which predates WebSocket support. So the question is: is a virtual machine self-contained, simply passing encapsulated data down to the layers on the host machine responsible for transmitting it, or would it try to use the host machine to run WebSockets?
I suspect that the application checks for WebSocket capability during the install, fails to get a satisfactory result, and so skips that part. An error message would have been much more helpful than silently skipping part of the installation process.
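For what it's worth, my understanding is that the WebSocket handshake is negotiated entirely between the client and the server process inside the guest OS; the hypervisor and host OS only forward ordinary TCP traffic. A minimal sketch of the server-side part of that handshake, the Sec-WebSocket-Accept computation from RFC 6455 (the sample key below is the one given in the RFC itself, not anything from my setup):

```python
import base64
import hashlib

# RFC 6455: the server proves it speaks WebSocket by hashing the
# client's Sec-WebSocket-Key together with a fixed GUID. To any
# hypervisor or host OS this is just plain HTTP-over-TCP traffic.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def accept_key(client_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header value for a handshake."""
    digest = hashlib.sha1((client_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Sample key/accept pair from RFC 6455, section 1.3
print(accept_key("dGhlIHNhbXBsZSBub25jZQ=="))  # s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

Nothing in that exchange involves the host, which is why I'd expect guest-OS support (Windows Server 2019) to be what matters, not the Server 2008 host.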
Any thoughts?