About Klaus Petersen

I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. I code in C(++) and Python, trying to keep up with my very talented colleagues :-)

High-performance Use Cases of LPVR & Varjo Headsets

Components of a VR/AR Operating System

Augmented and virtual reality technology helps boost worker productivity in various fields such as automotive, aerospace, industrial production, and more. While the context of each application is usually fairly specific, some aspects are common to many of these use cases. In this article, we will specifically explore the topic of pose tracking of Varjo head-mounted displays (HMDs) based on LP-RESEARCH's LPVR operating system. We will then present two customer use cases that utilize LPVR in different ways.

In a typical VR/AR setup, you find three main subsystems as shown in the illustration below:

With our LPVR operating system, we connect these three building blocks of a VR/AR system and make them communicate seamlessly with each other while providing a simple, unified interface to the user. Depending on the specific use case, users might select different types of hardware to build their VR/AR setup. LPVR therefore offers a wide range of interface options to adapt to systems from various manufacturers.

LPVR Flavors

LPVR comes in different flavors; we can group end applications into two categories:

  • LPVR-CAD – Static AR/VR setups, where multiple users operate and collaborate in one or more joint tracking volumes. These tracking volumes can be situated in different locations.
  • LPVR-DUO – AR/VR systems located in a vehicle or on a motion platform. Such systems have special requirements, especially on the tracking side. If, for example, you want to track a headset inside a car, displaying a virtual cockpit anchored to the car frame and a virtual outside world fixed to a global coordinate system, you need a means of locating the car in the world and of referencing the HMD locally within the car frame.


In the following paragraphs, we will introduce two customer use cases that cover these two basic scenarios.

Large-scale Industrial Design at Hyundai

– Varjo XR-3 at Hyundai Design Center with optical markers attached. Image credit: Hyundai

For the Korean automotive company Hyundai Motor Corporation, we created a large, location-based virtual reality installation at their research and development center in Namyang, Korea. The system is used to showcase, amend and modify prototype and production-ready automobile designs.

This application uses optical outside-in tracking and LP-RESEARCH's LPVR-CAD solution to track up to 20 users wearing head-mounted displays. While LPVR allows a mix of different headset types to operate in the same tracking volume, the Varjo XR-3 delivers outstanding performance for inspecting objects in high resolution and great detail. In addition to an HMD, users carry hand controllers, for a total of more than 40 tracked objects in a space of close to 400 m².

– Hyundai’s collaborative virtual reality design experience. Image credit: Hyundai

Responsiveness is achieved by using LPVR-CAD to combine data from the inertial measurement unit built into the headsets and information from the optical tracking system. The optical system uses 36 infrared cameras to track the 160 markers attached to the HMDs and hand controllers. Perfectly smooth and uninterrupted position and orientation data of each user’s HMD is achieved by using LP-RESEARCH’s sensor fusion algorithms.
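The fusion principle described above can be illustrated with a complementary filter: the gyroscope provides fast, high-frequency orientation updates, while the drift-free optical measurements continuously correct the slow gyroscope drift. The one-axis sketch below is a simplified illustration of this idea, not LP-RESEARCH's actual proprietary sensor fusion algorithm:

```python
def fuse(gyro_rate, optical_angle, prev_angle, dt, alpha=0.98):
    """One-axis complementary filter: integrate the gyroscope rate for
    responsiveness, then blend in the optical angle to cancel drift."""
    predicted = prev_angle + gyro_rate * dt      # fast IMU prediction
    return alpha * predicted + (1.0 - alpha) * optical_angle

# A gyroscope with a constant bias of 0.01 rad/s would drift without
# correction; the optical measurement (true angle 0.0) bounds the error.
angle = 0.0
for _ in range(100):
    angle = fuse(gyro_rate=0.01, optical_angle=0.0, prev_angle=angle, dt=0.01)
```

With the blending factor alpha close to 1, the estimate follows the gyroscope almost instantly while the residual drift error stays bounded instead of growing without limit.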

Depending on the type of headset, users either wear a backpack PC, connect to a host wirelessly or use an extra-long cable to connect directly to a rendering PC outside the tracking volume.

“Currently, we are actively utilizing VR from the initial development stage to the point of development. In the future, we plan to increase accessibility and usability by simplifying equipment using wireless HMDs. For this, improving the speed and stability of wireless internet is essential, which we plan to address by introducing 5G. In addition, LP RESEARCH’s technology is essential for multi-user location sharing within a virtual space.” – SungMook Kang, Visualization Specialist, Hyundai Motor Company

Next-level Automotive Entertainment with CUPRA

Imagine… playing Mario Kart: your hands grip the wheel, and you are on a race track in Neo Tokyo. Futuristic buildings fly by as you race ahead, drifting through long turns and leaving your competitors behind.

Now imagine you are no longer in your living room: you are sitting in an actual race car, buzzing around an empty parking lot. Instead of looking through the windshield with your own eyes, you are wearing a Varjo XR-3 HMD. What you see outside the car is a virtual world: it's Neo Tokyo.

– The view through the Varjo XR-3 headset. Image credit: CUPRA

As the car moves on the parking lot, you move inside the virtual world. When you move your head inside the car’s cockpit, the motions of your head are accurately tracked.

– Varjo XR-3 inside the cabin of the Urban Rebel. Image credit: Fonk Magazine

– Cupra’s Urban Rebel drifting on the test course

Together with the Norwegian company Breach VR, we have implemented this experience for the automotive company CUPRA. CUPRA is relentlessly pushing the technology of their vehicles into the future, striving to provide a novel driving experience to their customers.

Tracking of the vehicle, and of the Varjo XR-3 inside it, is achieved with LP-RESEARCH's automotive tracking system LPVR-DUO. Because the headset's gyroscope sensors record the superimposed motion of the car and of the user's head inside the car, a specialized sensing system and algorithm are required to separate the two.
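Conceptually, the head motion relative to the cabin can be recovered by subtracting the vehicle's angular velocity, measured by a car-fixed reference sensor, from the headset's gyroscope reading. The sketch below illustrates only this basic idea under the simplifying assumption that both rates are expressed in the same coordinate frame; it is not the actual LPVR-DUO algorithm:

```python
import numpy as np

def head_in_car_rate(hmd_gyro, car_gyro):
    """Estimate head rotation rate relative to the cabin by removing the
    vehicle's angular velocity (from a car-fixed IMU) from the HMD gyro.
    Simplifying assumption: both rates share the same coordinate frame."""
    return np.asarray(hmd_gyro, dtype=float) - np.asarray(car_gyro, dtype=float)

# Car yawing at 0.5 rad/s while the head is stationary inside the cabin:
rate = head_in_car_rate([0.0, 0.0, 0.5], [0.0, 0.0, 0.5])
```

Without this separation, the virtual cockpit would appear to swing around the user every time the car turns, because the headset's sensors cannot by themselves distinguish head motion from vehicle motion.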

The result of this cascade of exceptional technology is a compellingly immersive driving experience of the future. The combination of an outstanding visualization device like the Varjo XR-3, LPVR state-of-the-art tracking, BreachVR’s 3D software and design and, last but not least, the incredible CUPRA race cars make for an exciting ride that you’ll greatly enjoy and never forget. Come and join the ride!

Check out this blog post on the Varjo Insider Blog.

Check out our Instagram for further use cases with Varjo’s HMDs: @lpresearchinc

Exploring Affective Computing Concepts

Introduction

Emotional computing isn’t a new field of research. For decades, computer scientists have worked on modelling, measuring and actuating human emotion, but doing so accurately and predictively has so far remained elusive.

– Image credit: pngwing.com

In the past years we have worked with the company Qualcomm to create intellectual property related to this topic, in the context of health care and the automotive space. Even though this project is fairly off-topic from our usual focus areas, it is an interesting sidetrack that I think is worth posting about.

Affective Computing Concepts

As part of the program, we worked on various ideas ranging from relatively simple sensing devices to complete affective control systems that regulate the emotional state of a user. Two examples of these approaches to emotional computing are shown below.

The Skin Color Sensor measures the color of a user's facial complexion, with the goal of estimating aspects of the person's emotional state from this data. The sensor takes the form of a small, unobtrusive patch attached to a spot on the user's forehead.

Another affective computing concept we have worked on is the Affectactic Engine, a small device that measures the emotional state of a user via an electromyography sensor and an accelerometer. Simply speaking, we imagine that high muscle tension and certain motion patterns correspond to a stressed emotional state, or represent a “twitch” the user might have.

The device, attached to the user's body by a wrist band, emits vibrations to alert the user when such a “stressed” emotional state is detected, with the goal of making the user aware of subconscious stress states.

Patents

In the course of this collaboration we created several groundbreaking patents in the area of affective computing:

Design of an Efficient CAN-Bus Network with LPMS-IG1

Introduction to Designing an Efficient CAN-Bus Network

This article describes how to design an efficient high-speed CAN-bus network with LPMS-IG1. We offer several sensor types with a CAN bus connection. The CAN bus is a popular network standard for automotive, aerospace and industrial automation applications, where a large number of sensor and actuator units must be connected with a limited amount of cabling.

While creating a CAN bus network is not difficult by itself, there are a few key aspects that an engineer should follow in order to achieve optimum performance.

Efficient CAN-Bus Network Topology

A common mistake when designing a CAN bus network is to use a star topology to connect devices to each other. In this topology, the signal from each device is routed to a central node over connections of similar length. The central node is connected to the host, which acquires data from and distributes data to the devices on the network.

To reach the full performance of a CAN bus network, we strongly discourage using this topology. Most CAN bus setups designed in this way will fail to work reliably and at high speed.

The CAN bus standard is fundamentally designed for a daisy-chain configuration, with one sensor unit or the data acquisition host as the first device in the chain and one device as the last.

Maximum CAN-Bus Speed and Cable Length

A key aspect of designing an efficient high-speed CAN-bus network is to correctly adjust bus cable lengths. The bus line running past each device should be the longest connection in the network, and each sensor should be connected to the bus by a short stub. A typical stub length is 10-30 cm, whereas the main bus line can be hundreds of meters long, depending on the desired transmission speed.

Speed        Maximum cable length
1 Mbit/s     20 m
800 kbit/s   40 m
500 kbit/s   100 m
250 kbit/s   250 m
125 kbit/s   500 m
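For planning purposes, the table can be encoded as a simple lookup to sanity-check a bus layout. The figures below are taken directly from the table above; the helper function is just an illustrative sketch:

```python
# Maximum recommended bus length per bit rate, from the table above.
MAX_BUS_LENGTH_M = {
    1_000_000: 20,
    800_000: 40,
    500_000: 100,
    250_000: 250,
    125_000: 500,
}

def max_bitrate_for_length(length_m):
    """Return the highest listed bit rate whose length limit covers the
    planned main bus line length, or None if the bus is too long."""
    rates = [rate for rate, limit in MAX_BUS_LENGTH_M.items() if length_m <= limit]
    return max(rates) if rates else None
```

For example, a 50 m main bus line should be run at 500 kbit/s or below, while a short 15 m bench setup can use the full 1 Mbit/s.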

Note that a CAN bus network needs to be terminated with a 120 Ω resistor at each end. This is especially important for bus lengths of more than 1-2 m and should be considered general good practice.

LPMS-IG1 CAN-Bus Configuration

One of our products with a CAN bus interface option is our LPMS-IG1 high performance inertial measurement unit. LPMS-IG1 can be flexibly configured to satisfy user requirements. It has the ability to output data using the CANopen standard, freely configurable sequential streaming or our proprietary binary format LP-BUS. These and further parameters can be set via our IG1-Control data acquisition application.
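To give a feel for what working with a sequentially streamed CAN frame looks like, the sketch below decodes an 8-byte payload of fixed-point sensor values. The message layout and scaling here are hypothetical examples, not LPMS-IG1's actual output format; the configured mapping is defined via IG1-Control:

```python
import struct

def decode_accel_frame(payload):
    """Decode a hypothetical 8-byte CAN payload: three little-endian
    int16 acceleration values scaled by 1/1000 g, plus two padding bytes.
    (Illustrative layout only; consult IG1-Control for the real mapping.)"""
    ax, ay, az = struct.unpack_from("<hhh", payload, 0)
    return (ax / 1000.0, ay / 1000.0, az / 1000.0)

# Example payload encoding ax = 1 g, ay = 0 g, az = -0.5 g:
accel = decode_accel_frame(struct.pack("<hhhxx", 1000, 0, -500))
```

Fixed-point encodings like this are common on CAN because a classical frame carries at most 8 data bytes, so floating-point values are usually scaled into integers on the sensor side.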

Some CAN bus data loggers that rely on the CANopen standard require users to provide an EDS file to automatically configure each device on the network. While our data acquisition applications do not generate EDS files automatically, it is possible to manually create an EDS file matching the settings in IG1-Control or LPMS-Control, as described in this tutorial.

In this article we have given a few essential insights into designing an efficient high-speed CAN-bus network with LPMS-IG1. If you would like to know more about this topic or have any questions, let us know!

LPVR-DUO Featured at Unity for Industry Japan Conference

Unity for Industry Conference – XRは次のステージへ

LPVR-DUO has been featured at the Unity for Industry online conference in Japan. TOYOTA project manager Koichi Kayano introduced LPVR-DUO with Varjo XR-1 and ART Smarttrack 3 for in-car augmented reality (see the slide above).

Besides explaining the fundamental functional principle of LPVR-DUO inside a moving vehicle – using a fusion of HMD IMU data, vehicle-fixed inertial measurements and outside-in optical tracking information – Mr. Kayano presented videos of content for a potential end-user application:

Based on a heads-up display-like visualization, TOYOTA’s implementation shows navigation and speed information to the driver. The images below show two driving situations with a virtual dashboard augmentation overlay.

AR Head-Mounted Display vs. Heads-Up Display

This use case leads us to a discussion of the differences between an HMD-based visualization solution and a heads-up display (HUD) that is, for example, fixed stationary to the top of a car's console. While putting on a head-mounted display requires a minor additional effort from the driver, a wearable device has several advantages in this scenario.

Content can be displayed at any location in the car: projected onto the dashboard, the middle console, the side windows, and so on. A heads-up display works only in one specific spot.

Because the HMD shows separate images to the driver's left and right eyes, we can display three-dimensional images. This allows for accurate placement of objects in 3D space. Correct positioning within the driver's field of view is essential for safety-relevant data: if the car's sensor array detects a hazardous situation, the driver knows exactly where the danger is coming from.

These are just two of many aspects that set HMD-based augmented reality apart from a heads-up display. The fact that large corporations like TOYOTA are starting to investigate this specific topic shows that the application of augmented reality in the car will be an important feature for the future of mobility.

NOTE: Image contents courtesy of TOYOTA Motor Corporation.

See-through Display First Look – LPVIZ (Part 3)

Virtual Dashboard Demonstration

This is a follow-up post to the introduction of our in-vehicle AR head-mounted display LPVIZ in part 1 and part 2.

To test LPVIZ, we created a simple demo scenario of an automotive virtual dashboard: a Unity scene with graphic elements commonly found on a vehicle dashboard, animated to make the scene look more realistic.

This setup is meant for static testing at our shop. For further experiments inside a moving vehicle we are planning to connect the animated elements directly to car data (speed etc.) communicated over the CAN bus.

The virtual dashboard is only a very simple example to show the basic functionality of LPVIZ. As described in a previous post, far more sophisticated applications can be implemented.

The image above was taken through the right-eye optical waveguide display of LPVIZ. We took it with a regular smartphone camera, so the quality is not very high. Nevertheless, it confirms that the display is working and correctly shows the virtual dashboard.

The user is looking at the object straight ahead. If the user rotates their head or changes position, the perspective of the object changes accordingly. An important point to mention is the high luminosity of the display: we took this photo with the interior lighting in our shop turned on normally and without any additional shade in front of the display.
