Tag related to Augmented Reality

Posts

Workshop – I: How would you design an AR marker (to minimize the computational cost of detection and tracking)?

Our `Workshop’ series tries to revisit some basic – or not so basic – concepts of AR/VR. We will attempt to keep this fairly frequent and try to answer genuine questions that crop up – mostly asked by students and researchers when they first start working in these fields. We are pretty sure this will turn out to be an exercise for us as well! An opportunity to go back to the drawing board, the papers and the references of each aspect.

So, without any further delay, our first workshop is about… the AR Marker!

Most people involved in AR have come across AR markers one way or the other. Nowadays, many SDKs, applications and efforts from AR enthusiasts are based on ARToolkit from HIT Lab, which, if we are not hugely mistaken, has made the concept widely known. In our humble opinion it is one of (if not the…) most important contributions in AR, particularly in its currently popular, mostly handheld and often `gimmicky’ incarnation. But how does one create one of those markers, and what effect does their design have on tracking performance?

AR markers need to obey a series of rules in order to be easily detectable by even the most basic of libraries. The border, the background and the marker image all follow conventions that make detection and tracking easier.

To begin with, markers must be square. The border is used to detect the marker in the first place and needs to be of a contrasting colour to the background – hence most markers are black and white. The full marker is usually twice as wide as the marker image, so the border has a thickness of about 1/4 of the marker’s width on each side.

Fig.1 An example of an AR Marker
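
To make these proportions concrete, here is a minimal sketch (our own illustration, not ARToolkit code) that lays out such a marker as an image array. It assumes Python with numpy; the function name `make_marker`, the binary 16×16 pattern and the pixels-per-cell scale are our own hypothetical choices.

```python
import numpy as np

def make_marker(pattern, cell_px=16):
    """Lay out a square marker: a thick black border with the marker
    image (pattern) occupying the central region.

    Proportions as described above: the full marker is twice as wide as
    the inner image, i.e. the border is ~1/4 of the marker's width on
    each side. `pattern` is a binary (0/1) 2D array, e.g. 16x16 cells.
    """
    pattern = np.asarray(pattern, dtype=np.uint8)
    inner = pattern.shape[0] * cell_px      # inner image size in pixels
    border = inner // 2                     # border thickness = inner/2 = marker/4
    size = inner + 2 * border               # full marker width (= 2 * inner)

    marker = np.zeros((size, size), dtype=np.uint8)   # start all black (the border)
    # upscale each pattern cell to cell_px x cell_px pixels (0 -> black, 1 -> white)
    scaled = np.kron(pattern * 255, np.ones((cell_px, cell_px), dtype=np.uint8))
    marker[border:border + inner, border:border + inner] = scaled
    return marker

# Example: a random (and therefore almost certainly asymmetric) 16x16 pattern.
rng = np.random.default_rng(0)
marker_img = make_marker(rng.integers(0, 2, size=(16, 16)))
print(marker_img.shape)   # (512, 512) for a 16x16 pattern at 16 px per cell
```

In a real workflow the central region would of course hold a designed image rather than random cells, but the layout arithmetic stays the same.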

The marker image itself must not be rotationally symmetric, because the algorithm needs to be able to detect orientation. The complexity of the image affects the distance from which the AR marker is trackable. Marker images are usually black on a white background but can be coloured as well; ARToolkit has provision for faster tracking when the marker image is in colour. The `Pro’ (4.4) version of ARToolkit uses a 2D barcode in the marker image to improve tracking performance when many markers are used at the same time. Overly fine detail in the marker image, when used to differentiate markers, can cause problems, because algorithms generally sample the marker at a resolution of 16×16 pixels. Last but not least, it is advisable to use the same camera for calibration and tracking.
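
As a rough illustration of the asymmetry requirement, the sketch below (again our own, not part of any library) compares a 16×16 sampled pattern against its 90°, 180° and 270° rotations. A pattern that matches one of its rotations – or differs from it in only a few cells – would leave the tracker unable to recover orientation reliably.

```python
import numpy as np

def rotation_margin(pattern):
    """Return the smallest number of cells by which the pattern differs
    from any of its 90/180/270 degree rotations.

    0 means the image is rotationally symmetric and unusable as a marker;
    larger values make orientation detection more robust.
    """
    pattern = np.asarray(pattern)
    return min(int(np.sum(pattern != np.rot90(pattern, k))) for k in (1, 2, 3))

# A plain centred cross is rotationally symmetric -> margin 0, reject it.
cross = np.zeros((16, 16), dtype=np.uint8)
cross[7:9, :] = 1
cross[:, 7:9] = 1
print(rotation_margin(cross))   # 0

# A random pattern will almost always have a comfortable margin.
rng = np.random.default_rng(1)
print(rotation_margin(rng.integers(0, 2, size=(16, 16))))
```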

References:

[1] ARToolkit – http://www.hitl.washington.edu/artoolkit/
[2] ARToolworks – http://www.artoolworks.com/

A look at Kinect Fusion and Lightspace from Microsoft labs

Nice video from The Verge – with colorful comments below – on work being done in Microsoft Labs on Kinect and Lightspace. We hope to experiment with Kinect after we are done with the bulk of the work for Project IVY.

A look at Kinect Fusion and Lightspace from Microsoft labs (source: The Verge)

AR April recap

We may have been silent during April due to obstacles unrelated to work, but we are back on track. Here is a summary of some noteworthy things we came across this month, in reverse chronological order.

An exploration of UX for AR

Synthetic Toys were invited to express our opinion on User Experience for Augmented Reality in the new blog AR-UX, by @robman of MOB-labs, following our position paper at the AR Standards Meeting in Barcelona this February.

March biweekly on AR – Part I

The AR field is frantically producing news. People seem to like the ‘buzz’ around the concept, developers and companies constantly introduce products and services, and the public seems to show lots of interest in what is out there.

In an effort to keep up with the news, we will try to post a summary of noteworthy subjects bi-weekly. And here is the first bunch…