Approximating Euler’s Number Using a Random Number Generator

I stumbled upon a mathematical curiosity on Twitter:

Pick random numbers between 0 and 1, until the sum is greater than 1. Repeat forever. The average number of numbers picked will be Euler’s number (e=2.718281828…)

Here is the proof: twitter.com/fermatslibrary/status/961238636418293763
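
In a nutshell: if N is the number of picks, then N > n exactly when the first n picks sum to at most 1, which happens with probability 1/n! (the volume of the standard n-dimensional simplex). So E[N] = P(N > 0) + P(N > 1) + P(N > 2) + … = 1/0! + 1/1! + 1/2! + … = e.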

The kids wanted to try, so we coded it in C#:
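
A minimal version looks something like this (a sketch of the idea, not necessarily the exact code we ran):

    using System;

    class EulerEstimate
    {
        static void Main()
        {
            var rng = new Random();
            long trials = 0;
            long totalPicks = 0;

            while (true)
            {
                // Pick uniform random numbers in [0,1) until the sum exceeds 1.
                double sum = 0;
                int picks = 0;
                while (sum <= 1.0)
                {
                    sum += rng.NextDouble();
                    picks++;
                }

                trials++;
                totalPicks += picks;

                // The running average of picks per trial converges to e.
                double avg = (double)totalPicks / trials;
                Console.WriteLine($"Avg:{avg} Error:{avg - Math.E}");
            }
        }
    }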


Here is a snippet of output:

Avg:2.70660032651792 Error:-0.0116815019411254
Avg:2.70662313432836 Error:-0.0116586941306869
Avg:2.70664593859308 Error:-0.0116358898659632
Avg:2.70666873931292 Error:-0.0116130891461275
Avg:2.70669153648869 Error:-0.0115902919703532

After a few thousand iterations, the error went down to around 0.0001.

It then occurred to me that this function might be an easy way to test the randomness of a Random Number Generator.
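
For example, you could feed the estimator any generator you want to test and see how far the running average lands from Math.E. A rough sketch (EstimateE is just an illustrative helper, not part of any library):

    using System;

    class RngTest
    {
        // Estimate e by averaging how many picks it takes for the sum
        // to exceed 1. 'next' is whatever generator is under test,
        // returning uniform values in [0,1).
        static double EstimateE(Func<double> next, int trials)
        {
            long totalPicks = 0;
            for (int t = 0; t < trials; t++)
            {
                double sum = 0;
                while (sum <= 1.0)
                {
                    sum += next();
                    totalPicks++;
                }
            }
            return (double)totalPicks / trials;
        }

        static void Main()
        {
            double avg = EstimateE(new Random().NextDouble, 1000000);
            Console.WriteLine($"Avg:{avg} Error:{avg - Math.E}");
        }
    }

Keep in mind that landing near e only exercises a single statistic, so it would be more of a quick sanity check than a full randomness test.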

What do you think?

 

Zed Mini – Unboxing and a Quick Review

In The Box

My two Zed Mini cameras arrived today, after a bit of a delay (the pre-order was placed sometime in October and was scheduled to ship in November). Included in the box:


  • Camera
  • Lens cap
  • Bracket for the HTC Vive
  • Paper alignment helper thingy
  • Short USB-C cable (1m)
  • Long USB-C cable (~4m, not pictured)

Software

The Getting Started page includes links to the ZED SDK and the Hello World sample app, Zed World. The ZED SDK requires CUDA 9.1, which it tries to install. For some reason, that part of the installation failed; I had to download CUDA from NVidia and install it myself, which worked fine. The ZED SDK comes with a few apps that help you test the camera; the first one to run is Diagnostics. My computer is an Alienware R15 laptop with a GTX 1070 graphics card and two USB ports, only one of which is USB 3.0.

The diagnostics app shows me this:

[screenshot: diagnostics results reporting the camera as both OK and not detected]

I’m not sure what this means; it seems to be contradicting itself. The camera is both OK and not detected. Hmmm..

The other ZED apps are similar to the ones that come with the regular ZED camera (the non-Mini version), and let you see the depth map.

[screenshot: depth map view]

As you can see, the camera has problems with featureless surfaces, such as walls, and tends to “fill in” gaps between objects (in this case, my head and arm). You’ll see more of this in the video below.

Technical Problems

  • Crashes: The system is very unstable. It crashed multiple times during testing, often requiring unplugging and re-plugging the camera.
  • Cable direction: The USB-C connector is reversible, but only one orientation works (it has a little arrow on that side). I’m confused as to why; I thought USB-C was fully reversible, but that doesn’t seem to be the case here.
  • Long cable: I couldn’t get the long cable to work at all. This is a huge limitation for me. The troubleshooting guide basically tells you to try another port (my laptop only has one), not to use a hub (I’m not), and to switch to the short cable if all else fails. Right. Thanks.
  • Windows 7: I couldn’t get the ZED SDK to run on my Windows 7 desktop. It kills the OS (Windows wouldn’t load), and I had to restore the last known good configuration to revive it (yes, I made sure to unplug the camera before booting; it didn’t help).

 

Performance

Here is the Zed World sample app. You can move/rotate/scale the objects with the controllers’ touchpad and the trigger buttons.

 

A few things you can’t see in the video:

  • The camera view has some latency; it’s very small, but noticeable when you move your head.
  • Due to parallax, objects tend to “swim” and shift when you look around. It’s really disorienting and… well.. weird. Very very weird.
  • The camera FOV doesn’t cover the entire Vive FOV, only about 50% of it. It’s about the size of a tissue box held at 15cm / 6″ from your face.
  • The camera IPD is a bit smaller than my own IPD, which also contributes to the overall weird feeling.
  • I started feeling queasy about 10 minutes in, due to the latency and the weird way objects swim around when you move your head.

 

Conclusion

I have really mixed feelings about this camera, and more testing is required. I’m going to contact support about the USB problem and the long cable not working, because that’s really limiting. I’m also going to try installing Windows 10 on my desktop and testing the camera there.

When it works, when that little planet flies through your hand, it’s incredibly satisfying. But then it often doesn’t work: it crashes, it gets the depth map wrong, and so on. I hope the drivers can be fine-tuned and improved, but some problems can’t be fixed (for example, the camera sits a few centimeters in front of your eyes, which causes weird things to happen when you move your head).

Finger tracking (similar to the Leap Motion) is definitely out of the question. The ZED Mini fails to separate fingers from background more often than not.

I’m still excited about the possible applications, specifically bringing your hands into a VR application. Using the ZED Mini as a poor man’s AR solution might be problematic. We’ll see.

I will update this review as more info arrives and I do further testing.

Shachar “Vice” Weis is a VR/AR developer and the founder of Packet39.com

The Accommodation-Vergence conflict and how it affects your kids (and yourself)

Introduction

Remember the first time you put on a VR headset? You were immersed in a virtual world and (hopefully) it was breathtaking. Then, if you stayed in VR for a good amount of time, the real world felt… weird when you took off the headset. Not right. Something odd about it. That feeling is very common for first-timers and usually goes away within a few minutes. The weirdness is your brain and eyes re-adjusting, switching back to their normal way of operating after being forced to work in a very unusual manner.

So what is going on?

It’s common knowledge that depth perception stems from the fact that we have two eyes, and we can infer depth from the displacement of objects between the two images. Large displacement (parallax) means a close object; small or no displacement means a far away object. This is true, but there’s much more to it. There are in fact a whole bunch of cues we use to infer depth. Let’s talk about two of them.

Accommodation

Eyes are a marvel of engineering. Camera lenses have multiple optical elements, some of which are movable; to change focus, the elements move in relation to one another. Eyes have only a single element (the lens), but unlike a camera’s, our lens is soft and squishy. To change focus, special muscles pull or relax the lens, changing its shape.

 

This is called the Accommodation reflex and it’s an important depth cue for objects that are close to us (less than 10 meters / 30 feet).

Convergence

Our eyes can move independently, and when looking at an object they converge on it.

Again, this is another depth cue.

The Conflict

In the real world, Accommodation and Vergence are always in sync. VR is a different story. In a VR headset the screen sits at a fixed apparent distance from your eyes, usually around 3 meters (9 feet). The actual screen is of course much closer, but the optics in the headset make it appear as if it were 3 meters away. Your eyes remain focused on the screen the whole time. If a virtual object is 20cm away, your eyes converge on a spot 20cm away, but they focus (accommodate) on the screen, 3 meters away. This is totally wrong and never happens anywhere except in VR headsets.
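
To put rough numbers on it (using the distances above): focus demand is measured in diopters, the reciprocal of distance in meters. A virtual object at 20cm calls for 1/0.2 = 5 diopters of accommodation, while the screen your eyes are actually focused on, at 3 meters, calls for only 1/3 ≈ 0.33 diopters. Vergence points at the 20cm spot while accommodation stays near 0.33 diopters, a mismatch of more than 4.5 diopters.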

 


If you have a VR headset you can easily test this. Go into a VR environment and look at something on the horizon or far away. Now hold up the controller to your face. Note how both the controller and the horizon are in perfect focus. This is optically impossible and your brain is actually freaking out over it.

But wait! Surely 3D movies and TVs have this problem too, and they have been around for a long time. Correct. The big difference, however, is that 3D displays don’t block out the real world and are usually far away from the viewer. They cause the same conflict, but to a much lesser degree.

Panic

As stated, the Accommodation-Vergence conflict is the reason the real world feels weird after being in VR. It’s also the main cause of eye-strain and headaches, and some people are more sensitive to it than others. I’ve seen reports of VR users losing depth perception for hours, even days, after a long VR session. In adults this is probably temporary, but it does raise concerns about children.

How does prolonged exposure to the Accommodation-Vergence conflict affect a developing visual system (i.e., kids)? Nobody knows. There are no real data or studies on this topic, which brings me to the most important conclusion of this article:

Limit children’s exposure to VR. Until we know more, it’s best to err on the side of caution. Short exposure is probably fine, but don’t let them spend hours in there, and insist on frequent breaks.

What about VR in education? Yes, this is a big problem. On one hand, VR can be an amazing tool for education; on the other, it might cause permanent vision problems. Moderation is key, and if you are an educator I will leave this decision in your hands.

I’m aware of this study, which showed a small improvement in vision when a small group of kids was tested for a short period of time. However, it did not test depth perception, which is the main area of concern.

Don’t Panic

The good news is that the Accommodation-Vergence conflict is solvable, and the solution is lightfield technology. I won’t go into much detail here (lightfields are damn complicated), but I will say that they solve the conflict and will allow multiple planes of focus and more natural eye behavior in VR. However, the technology is still in its infancy, and it will take 3-5 years before we see it in consumer VR headsets.

Written by Shachar “Vice” Weis, founder and CEO of Packet39, Virtual Reality training solutions for the manufacturing and power industries.

Further Reading

https://www.scientificamerican.com/article/are-virtual-reality-headsets-safe-for-children/

https://www.theguardian.com/technology/2017/oct/28/virtual-reality-headset-children-cognitive-problems

https://www.wareable.com/vr/safe-kids-vr-headsets-eyes-development-888

I installed a TPCast, was blown away, and then went back to tethered. A very subjective review.

A few days ago I had the opportunity to grab a TPCast and install it on my home office Vive. Here are my thoughts:

  • Lots of parts, clunky. Battery+holder, receiver, transmitter, router (why?)
  • Setup was easier than expected. Pretty much plug in everything, install the TPCast software.
  • Had some issues that went away after a reboot
  • Exhilarating feeling of freedom without the tether. Mind blowing really.
  • Very quickly realized I have nowhere to keep the battery. I usually VR in my underwear
  • Transmitter makes a very annoying squeal / buzz when it’s working
  • I see a green bar at the very edge of peripheral vision on the right side. I increased my FOV on my Vive by using a thin facepad. With the stock facepad the green bar is not visible, but it’s really hard to go back to stock FOV.
  • I did not see a reduction in image quality or lag
  • I did notice some jitteriness with motion-intensive games like SoundBoxing. I can’t put my finger on exactly what the difference is, but it’s not as smooth as tethered. But again, only with very active games.
  • The Vive microphone is not supported, nor is the headset camera or extra USB port.

I’m now back to the good old tether. Why? It’s just.. clunky. A collection of small annoyances. The battery in my pocket. The annoying buzz. The green bar glowing just outside my view. The extra tripod for the transmitter and extra cables for the router. Having to charge my headset. None of these is a huge deal, but put together.. I decided to go back to tethered.

 

UPDATE – Nov 2, 2017

The great people of OpenTPCast are working to solve some of these issues. I understand that the microphone is now working, with camera support coming very soon (for owners of the newer batches of Vives). They have also managed to improve tracking and reduce jitter. Check it out here: https://github.com/OpenTPCast/Docs

In addition, I’ve seen unconfirmed reports that the US version of TPCast eliminates the annoying transmitter whine. I’ll update this post when more information becomes available.

 

UPDATE – Dec 21, 2017

My TPCast sat in a box for a few weeks, until I decided I wanted to use it in an upcoming demo. I tried it on two different laptops and couldn’t get it to work. I swapped cables, used DisplayPort instead of HDMI, bypassed the link box, tried older versions of the TPCast assistant, talked to support; nothing helped. Eventually I gave up and installed OpenTPCast. And it worked. Just like that. I set up two Vives in the same space, using a single pair of base stations, and did a direct comparison between a tethered Vive and a TPCast’ed Vive. My thoughts:

  • Going untethered is still exhilarating
  • Tracking is noticeably affected, for the worse. Occasional visible jitters, jumps and pauses.
  • Display sporadically flickers in and out of low-resolution / high-compression mode. Visible compression artifacts will appear for a fraction of a second.
  • OpenTPCast uses VirtualHere for USB-over-wifi. VirtualHere requires a license, which is $25 USD.
  • Green and blue bars at the edge of vision are still there.
  • It made me sick! I’m not sure if it’s the jitter or some kind of lag that’s visually unnoticeable but affected me over time. After walking around for a few minutes I started feeling a hint of queasiness, which hasn’t happened to me in VR for a long while.

UPDATE – Jan 2, 2018

I decided to throw a belated New Year VR party for a dozen people, and to use TPCast while I’m at it. Here are my impressions from this event:

  • It took an extra 20 minutes to set up the TPCast system, on top of the usual Vive setup. I found that the only way to get it working reliably is to enable the USB drivers manually, one by one (in VirtualHere). At some point it was almost working, but then the Windows Audio service crashed and I had to reboot and start over. This is probably related to the USB audio driver that VirtualHere adds to the system.
  • People had a blast playing untethered. Gorn and Space Pirate Trainer were the crowd favorites. My own game – Orbital Injection – was great for first-timers.
  • Tracking was a bit spotty, occasional jitter and pause, but nothing game breaking and nobody got sick (I was worried about that).
  • Display was perfect, no artifacts this time
  • Every 30 minutes or so, the TPCast wifi module dies and tracking stops completely. The only fix is a hard restart (unplug the battery), which also requires restarting SteamVR and sometimes VirtualHere as well. It takes a few minutes of futzing around on the computer and it’s really annoying.
  • I would rather have 2 or 3 small batteries than a single giant battery. Sadly, there is only one battery that fits into the TPCast right now, and it’s massive.

Conclusion: Fun for events, when it works and you are willing to put in the extra setup time and troubleshooting. The device is not reliable enough for commercial use or businesses.

 

 

Microsoft Motion Controllers – Unboxing and first look

Hooray, I just got my Microsoft Motion Controllers, to pair with my Acer “MixedReality” headset (which can’t actually do MR in any way). See my review here and here.

So, are they any good?

Short Answer

Tracking is superb. Build quality is horrible.

Long Answer

I received two controllers in a plain cardboard box devoid of any markings, text or symbols.


The box contains two controllers and two pairs of AA batteries, and nothing else.

Put in the batteries, hold the Windows button to turn them on, and presto. The first surprise: they use white visible-light LEDs. For some reason I thought these would be IR.


 

Moving on to the software, running Mixed Reality Portal greeted me with this screen:

[screenshot: Mixed Reality Portal]

After a bit of digging around, I found that the battery compartment hides a secret button; holding it down for a few seconds puts the controller into pairing mode. You’ll need a Bluetooth-capable computer (dongle not included). What isn’t documented anywhere is that the pass code is 0000. Yeah, I guessed it. On my third attempt.


The Good Part

The tracking is surprisingly good. It’s really spot-on, HTC Vive quality, as long as the controller is within view of the headset cameras. The tracking FOV is 180 degrees horizontal but only about 100 degrees vertical. In fact, when moving up, the controller will sometimes lose tracking while it’s still inside the headset’s virtual view (because the cameras are tilted down).

When the controller is outside the camera view, it will freeze in position but will still track orientation.

There is a slight lag in this video, but that lag is only in the screen display. The headset view is lag-free.

 

The Bad Part

The build quality is.. how to put it gently.. cheap-plasticky-Chinese-knockoff kind of quality. Here, watch this:

 

Trigger: It’s binary and offers no click, no resistance, no feedback when you press it. It feels like a $5 RC car controller.

Touchpad: Up, down, left and right all have a different “clickiness” to them. Different sound, different feel, different pressure required to trigger. UP is very “clicky”, DOWN barely registers.

Thumb stick: Movement is OK, but it also has a cheap feel to it. Clicking it generates a nasty grinding sensation (clearly audible in the video). Both of my controllers exhibit this problem, but it’s much more pronounced in one of them. It really is horrible.

Buttons: The main “Windows” button and the side button are both fine, but the smaller menu button doesn’t register any click sensation or noise. It also doesn’t do anything in the Mixed Reality Portal. I’m not sure if it’s just broken (on both controllers), or if it’s designed to be a “silent” button, or what. Actually, I don’t care. It’s terrible as well.

Conclusion

I was genuinely surprised by how well the controllers track in space, and the headset tracking has also improved since I last tried it. It seems the latest Windows update has refined the tracking algorithms. When moving the controller sideways or down, you can reach pretty far before it loses tracking. Moving it up is problematic, as it goes out of camera view before it even leaves your virtual view.

The build quality is well below the standard I was expecting, and on par with cheap knockoff XBOX controllers.

About the author

Shachar “Vice” Weis is a freelance software developer and the CTO / founder of Packet39.com, a software house that builds custom VR applications for the manufacturing and power industries.

How to troll Google Maps on a national level

If you happen to look at Israel on Google Maps on Yom Kippur, you’ll notice something interesting: the entire country is red with traffic. And these are no ordinary traffic jams.


So what is going on? Something weird happens on Yom Kippur in Israel. For 24 hours, from dusk to dusk, nobody drives. The religious and atheists alike take a break from their cars. The roads and highways are empty and everyone takes to the streets. You’ll see kids biking on deserted highways, people just hanging out in the middle of an intersection. It feels post-apocalyptic and it’s amazing. The silence hits you like a wall.


 

Of course many of these people have cellphones, and none of them are moving very fast; hence, Google thinks it’s a country-spanning traffic jam. And that’s how you troll Google Maps in style. I wish every country would adopt this tradition.

Hell, I would make it a monthly event.