A 3D integral imaging optical see-through head-mounted display

Hong Hua,1,* and Bahram Javidi2

1 3DVIS Lab, College of Optical Sciences, University of Arizona, Tucson, AZ 85721, USA
2 Department of Electrical and Computer Engineering, University of Connecticut, Storrs, CT 06269, USA
*[email protected]

Abstract: An optical see-through head-mounted display (OST-HMD), which enables optical superposition of digital information onto the direct view of the physical world and maintains see-through vision to the real world, is a vital component in an augmented reality (AR) system. A key limitation of the state-of-the-art OST-HMD technology is the well-known accommodation-convergence mismatch problem, caused by the fact that the image source in most existing AR displays is a 2D flat surface located at a fixed distance from the eye. In this paper, we present an innovative approach to OST-HMD design that combines recent advances in freeform optical technology with the microscopic integral imaging (micro-InI) method. A micro-InI unit creates a 3D image source for the HMD viewing optics, instead of the typical 2D display surface, by reconstructing a miniature 3D scene from a large number of perspective images of the scene. By taking advantage of emerging freeform optical technology, our approach yields a compact, lightweight, goggle-style AR display that is potentially less vulnerable to the accommodation-convergence discrepancy problem and visual fatigue. A proof-of-concept prototype system is demonstrated, which offers a goggle-like compact form factor, a non-obstructive see-through field of view, and true 3D virtual display.

© 2014 Optical Society of America

OCIS codes: (120.2040) Displays; (120.2820) Heads-up displays; (120.4570) Optical design of instruments; (330.7338) Visually coupled optical systems; (110.6880) Three-dimensional image acquisition.

References and links
1. R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent advances in augmented reality,” IEEE Comput. Graph. Appl. 21(6), 34–47 (2001).
2. F. Zhou, H. B.-L. Duh, and M. Billinghurst, “Trends in augmented reality tracking, interaction and display: a review of ten years of ISMAR,” in Proc. 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, 193–202 (2008).
3. http://www.google.com/glass/start/
4. S. Yano, M. Emoto, T. Mitsuhashi, and H. Thwaites, “A study of visual fatigue and visual comfort for 3D HDTV/HDTV images,” Displays 23(4), 191–201 (2002).
5. S. J. Watt, K. Akeley, M. O. Ernst, and M. S. Banks, “Focus cues affect perceived depth,” J. Vis. 5(10), 834–862 (2005).
6. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence-accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vis. 8(3), 33 (2008).
7. G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” J. Phys. (Paris) 7, 821–825 (1908).
8. J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003).
9. M. Martínez-Corral, H. Navarro, R. Martínez-Cuenca, G. Saavedra, and B. Javidi, “Full parallax 3-D TV with programmable display parameters,” Opt. Photon. News 22(12), 50 (2011).
10. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013).
11. C. W. Chen, M. Cho, Y. P. Huang, and B. Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” IEEE J. Disp. Technol. 10(3), 198–203 (2014).


12. J. Arai, M. Kawakita, T. Yamashita, H. Sasaki, M. Miura, H. Hiura, M. Okui, and F. Okano, “Integral three-dimensional television with video system using pixel-offset method,” Opt. Express 21(3), 3474–3485 (2013).
13. H. Sasaki, K. Yamamoto, Y. Ichihashi, and T. Senoh, “Image size scalable full-parallax coloured three-dimensional video by electronic holography,” Sci. Rep. 4, 4000 (2014).
14. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. (Proc. ACM SIGGRAPH 2007) 26(3) (2007).
15. Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express 19(2), 704–716 (2011).
16. B. G. Blundell and A. J. Schwarz, “The classification of volumetric display systems: characteristics and predictability of the image space,” IEEE Trans. Vis. Comput. Graph. 8(1), 66–75 (2002).
17. S. Liu, H. Hua, and D. Cheng, “A novel prototype for an optical see-through head-mounted display with addressable focus cues,” IEEE Trans. Vis. Comput. Graph. 16(3), 381–393 (2010).
18. S. Liu and H. Hua, “A systematic method for designing depth-fused multi-focal plane three-dimensional displays,” Opt. Express 18(11), 11562–11573 (2010).
19. X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” J. Disp. Technol. 10(4), 308–316 (2014).
20. P.-A. Blanche, A. Bablumian, R. Voorakaranam, C. Christenson, W. Lin, T. Gu, D. Flores, P. Wang, W.-Y. Hsieh, M. Kathaperumal, B. Rachwal, O. Siddiqui, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature 468(7320), 80–83 (2010).
21. H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, T. Yoshida, M. Kuwahara, and K. Aiki, “A full-color eyewear display using planar waveguides with reflection volume holograms,” J. Soc. Inf. Disp. 17(3), 185–193 (2009).
22. http://www.lumus-optical.com/
23. S. Yamazaki, K. Inoguchi, Y. Saito, H. Morishima, and N. Taniguchi, “Thin wide-field-of-view HMD with free-form-surface prism and applications,” Proc. SPIE 3639, 453–462 (1999).
24. D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an optical see-through head-mounted display with a low f-number and large field of view using a freeform prism,” Appl. Opt. 48(14), 2655–2668 (2009).
25. D. Cheng, Y. Wang, H. Hua, and J. Sasian, “Design of a wide-angle, lightweight head-mounted display using free-form optics tiling,” Opt. Lett. 36(11), 2098–2100 (2011).
26. H. Hua, X. Hu, and C. Gao, “A high-resolution optical see-through head-mounted display with eyetracking capability,” Opt. Express 21(25), 30993–30998 (2013).
27. http://www.innovega-inc.com
28. A. Maimone and H. Fuchs, “Computational augmented reality eyeglasses,” in Proc. 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 29–38 (2013).

1. Introduction

An augmented reality (AR) display, which allows overlaying of 2D or 3D digital information on a person’s real-world view, has long been portrayed as a transformative technology that will redefine the way we perceive and interact with digital information [1,2]. For example, in medicine AR technology may enable a physician to see CT images superimposed onto the patient’s abdomen while performing surgery; in mobile computing it may allow a tourist to access reviews of restaurants in his or her sight while walking down a street; in military training it may allow warfighters to train effectively in environments that blend 3D virtual objects into live training settings. Despite these promises, AR technology has yet to be fully embraced and practically used in most application fields. In the state of the art, the most critical barriers to AR technology are defined by the displays. A desired form of AR display is a lightweight optical see-through head-mounted display (OST-HMD), which enables optical superposition of digital information onto the direct view of the physical world while maintaining see-through vision of the real world.

Along with the rapidly increasing bandwidth of wireless networks, the miniaturization of electronics and optoelectronics, and the prevalence of cloud computing, a significant research and market drive in recent years has been toward an unobtrusive AR display that integrates the functions of OST-HMDs, smart phones, and mobile computing within the volume of a pair of eyeglasses. A few promising commercial OST-HMDs have demonstrated very compact, lightweight form factors and a high potential for widespread public use, resulting in significant advances in OST-HMDs. For instance, Google Glass [3] is a very compact, lightweight (~36 grams), monocular OST-HMD, providing the benefit of encumbrance-free instant access to digital information. Although it has demonstrated a promising and exciting future for AR displays, the current version of Google Glass offers a very narrow FOV


(about 15° diagonally) with an image resolution of 640×360 pixels. A compact, low-cost OST-HMD with a much wider FOV and higher resolution is desired to effectively augment the real-world view in many applications and to exploit the full range of benefits afforded by AR technologies.

Importantly, minimizing the visual discomfort involved in wearing AR displays remains an unresolved challenge. One of the key factors causing visual discomfort is the accommodation-convergence discrepancy between the displayed digital information and the real-world scene, which stems from the fact that the image source in most existing AR displays is a 2D flat surface located at a fixed distance from the eye. Consequently, this type of AR display lacks the ability to render correct focus cues, including accommodation and retinal blurring, for digital information overlaid on real objects located at distances other than that of the 2D image source. This causes three types of accommodation-convergence conflict. First, as shown in Fig. 1(a), there exists a mismatch of accommodation cues between the 2D image plane and the real-world scene: the eye is cued to accommodate at the 2D image plane to view the augmented information, while it is concurrently cued to accommodate and converge at the depth of the real 3D object onto which the digital information is overlaid. The distance gap between the display plane and real-world objects can easily exceed what the human visual system (HVS) can accommodate simultaneously. A simple example is the use of an AR display for driving assistance, where the eyes must constantly switch attention between the AR display and real-world objects spanning from near (e.g. the dashboard) to far (e.g. road signs). Second, in a binocular stereoscopic display, by rendering a pair of stereoscopic images with binocular disparities, the augmented information may be rendered at a distance different from the 2D display surface (Fig. 1(b)). When viewing the augmented 3D information, the eye is cued to accommodate at the 2D display surface to bring the digital information into focus, but at the same time it is forced to converge at the depth dictated by the binocular disparity to fuse the stereoscopic pair. In viewing a natural scene (Fig. 1(c)), the eye’s convergence depth coincides with its accommodation depth, and objects at depths other than the object of interest are seen blurred. Finally, synthetic objects rendered via stereoscopic images, regardless of their rendered distance from the user, are all seen in focus if the viewer focuses on the image plane, or all blurred if the viewer accommodates at another distance (Fig. 1(b)); the retinal image blur of the displayed scene does not vary with the distance from the eye’s fixation point to other points at different depths in the simulated scene. Many psychophysical studies have investigated the adverse consequences of incorrectly rendered focus cues in stereoscopic displays [4–6]. In a nutshell, incorrect focus cues may contribute to the commonly recognized issues in viewing stereoscopic displays, such as distorted depth perception, diplopic vision, visual discomfort and fatigue, and degraded oculomotor response.
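To give a feel for the magnitude of this mismatch, the conflict is conveniently expressed in diopters (inverse meters), the natural unit of accommodative demand. The short sketch below is our own illustration, not part of the original paper; the 2 m image plane and the content depths are assumed example values.

```python
# Illustrative sketch (not from the paper): accommodation-vergence
# conflict, in diopters, for a display with a fixed 2D image plane.

def conflict_diopters(image_plane_m, rendered_depth_m):
    """Mismatch between the accommodation cue (fixed image plane) and
    the vergence cue (depth implied by binocular disparity)."""
    return abs(1.0 / image_plane_m - 1.0 / rendered_depth_m)

# Assumed example: image plane fixed at 2 m, content rendered at
# depths from 0.3 m (dashboard-like) to 5 m (road-sign-like).
for depth_m in (0.3, 1.0, 2.0, 5.0):
    print(f"content at {depth_m:4.1f} m -> conflict {conflict_diopters(2.0, depth_m):.2f} D")
# Near content (0.3 m) yields ~2.8 D of conflict, far beyond the
# fraction-of-a-diopter mismatch usually considered comfortable.
```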
In this paper, we describe a novel OST-HMD design that combines the emerging freeform optical technology with the microscopic integral imaging (micro-InI) method. A micro-InI unit reconstructs a miniature 3D scene from a large number of perspective images of that scene. The reconstructed 3D scene serves as the image source for the HMD viewing optics, replacing the typical 2D display surface with a 3D source and thus potentially overcoming the accommodation-convergence discrepancy problem. By taking advantage of the emerging freeform optical technology, our approach yields a compact, lightweight, goggle-style AR display that is potentially less vulnerable to the accommodation-convergence discrepancy problem and visual fatigue. We present experiments and demonstrate a proof-of-concept prototype system, which offers a goggle-like compact form factor, a non-obstructive see-through field of view, and true 3D virtual display.


Fig. 1. Illustration of accommodation-convergence cues in (a) monocular AR display; (b) stereoscopic viewing in a binocular display; and (c) viewing a real object.

2. Optical design of the InI-HMD system

The key challenge in creating a lightweight and compact OST-HMD solution that is invulnerable to the accommodation-convergence discrepancy problem is to address two cornerstone issues. The first is to provide the capability of displaying 3D scenes with focus cues correctly rendered for the intended distance, correlated with the eye convergence depth in an AR display, rather than on a fixed-distance 2D plane. The second is to create an optical design of the eyepiece optics with a form factor as compelling as a pair of eyeglasses.

In terms of 3D display methods, several non-stereoscopic display methods, including integral imaging [7–13], lightfield [14], super multi-view (SMV) [15], volumetric [16], multi-focal-plane [17–19], and holographic displays [20], are potentially able to overcome the accommodation-convergence discrepancy problem, each with a different level of limitation. Among these methods, an InI-based display allows the reconstruction of the full-parallax lightfields that appear to be emitted by a 3D scene, viewable from constrained or unconstrained viewing zones. Compared with the other techniques, the InI technique requires the least hardware complexity, which makes it possible to integrate with an OST-HMD optical system to create a wearable true-3D AR display. In terms of eyepiece optics design, several optical technologies have been explored for OST-HMD designs with the ultimate goal of achieving eyeglass-like wearable displays,


including holographic optical elements (HOE) [21], reflective waveguides or light guides [22], freeform optics [23–26], contact lens displays [27], and computational multi-layer designs [28]. Among these, the emerging freeform optical technology shows great promise for designing compact HMD systems [23–26].

To address the two key issues stated above, we propose a novel optical scheme that integrates a 3D microscopic InI method for full-parallax 3D scene visualization with the emerging freeform optical technology for the OST-HMD eyepiece optics. This approach enables a compact 3D integral imaging optical see-through HMD (InI-OST-HMD) with full-parallax lightfield rendering capability. Figure 2(a) shows the schematic optical layout of an integrated InI-OST-HMD system based on our new approach. The optics consists of three key subsystems: a microscopic InI unit (micro-InI) reproducing the full-parallax lightfields of a 3D scene seen from constrained viewing zones; a freeform eyepiece relaying the reconstructed 3D lightfields into a viewer’s eye; and a see-through system optically enabling a non-obtrusive view of the real-world scene.

The micro-InI unit, as schematically illustrated in Fig. 2(b), consists of a high-resolution microdisplay and a microlens array (MLA). A set of 2D elemental images, each representing a different perspective of a 3D scene, is displayed on the high-resolution microdisplay. Through the MLA, each elemental image acts as a spatially incoherent object, and the conical ray bundles emitted by the pixels of the elemental images intersect and integrally create the perception of a 3D scene that appears to emit light and occupy 3D space. Such an InI system allows the reconstruction of a 3D surface shape with parallax information in both the horizontal and vertical directions. The lightfield of the reconstructed 3D scene (e.g. the curve AOB in Fig. 2(b)) is directly coupled into the eyepiece optics for viewing.

To achieve a compact optical design of the OST-HMD, a wedge-shaped freeform prism [23–26] is adopted as the eyepiece, through which the 3D scene reconstructed by the micro-InI unit is magnified and viewed. As illustrated in Fig. 2(c), the prism eyepiece is formed by three freeform optical surfaces, labeled 1, 2, and 3. The exit pupil is where the eye is placed to view the magnified 3D scene. A light ray emitted from a 3D point (e.g. A) in the intermediate scene is first refracted by surface 3 next to the reference plane, followed by two consecutive reflections by surfaces 1' and 2, and is finally transmitted through surface 1 to reach the exit pupil of the system. Multiple ray directions from the same object point (e.g. the red, green, and blue rays from point A), each representing a different view of the object, impinge on different locations of the exit pupil and reconstruct a virtual 3D point (e.g. A') in front of the eye. Rather than requiring multiple elements, the optical path is naturally folded within the three-surface prism structure, which substantially reduces the overall volume and weight of the optics compared with designs using rotationally symmetric elements.

To enable the see-through capability for AR systems, surface 2 of the prism in Fig. 2(c) is coated as a beamsplitting mirror, and a freeform corrector lens (Fig. 2(a)), consisting of two freeform surfaces (2' and 4), is attached to surface 2 of the prism to correct the viewing-axis deviation and the undesirable aberrations introduced by the freeform prism to the real-world scene. The rays from the virtual lightfield are reflected by surface 2 of the prism, while the rays from the real-world scene are transmitted through the freeform lens and prism. The front surface 2' of the freeform lens matches the shape of surface 2 of the prism, and the back surface 4 is optimized to minimize the shift and distortion introduced to rays from the real-world scene when the lens is combined with the prism. The additional corrector lens does not noticeably increase the footprint or weight of the overall system.
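To make the elemental-image geometry concrete, the sketch below (our own illustration, not the authors' rendering pipeline) treats each lenslet as a pinhole camera: a point on the reconstructed scene maps into each elemental image by central projection through that lenslet's center onto the microdisplay plane. The MLA pitch is the prototype value from Section 3; the display-to-MLA gap is an assumed value chosen to be consistent with the ~10 mm reconstruction distance reported there.

```python
# Illustrative sketch: where one 3D scene point lands in neighboring
# elemental images of a micro-InI unit (pinhole-per-lenslet model).

def project_point(point_xyz, lens_center_xy, gap_mm):
    """Central projection of a 3D point (z measured from the MLA plane,
    positive toward the viewer) through one lenslet center onto the
    microdisplay, which sits gap_mm behind the MLA."""
    x, y, z = point_xyz
    cx, cy = lens_center_xy
    return (cx + (gap_mm / z) * (cx - x),
            cy + (gap_mm / z) * (cy - y))  # display coordinates, mm

pitch = 0.985             # MLA pitch from the prototype (Section 3), mm
gap = 4.9                 # assumed microdisplay-to-MLA gap, mm
point = (0.0, 0.0, 10.0)  # a point on the reconstructed intermediate scene
for i in (-1, 0, 1):      # three adjacent lenslets in one row
    u, v = project_point(point, (i * pitch, 0.0), gap)
    print(f"lenslet {i:+d}: display point ({u:+.3f}, {v:+.3f}) mm")
# Each lenslet records the point with a slightly different offset; the
# MLA integrates these perspective-shifted pixels back into a single
# point perceived at 10 mm, i.e. the intersecting conical ray bundles
# described above.
```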


Fig. 2. Schematic design of a 3D integral imaging optical see-through HMD using freeform optical technology: (a) Integrated optical layout and raytracing of the virtual display and see-through paths; (b) Schematic of the microscopic integral imaging unit, in which the reference plane corresponds to the conjugate plane of the microdisplay image formed by the MLA; (c) Schematic raytracing for visualizing a 3D lightfield through a freeform prism, in which the virtual reference plane corresponds to the conjugate plane imaged by the eyepiece.

Several key differences exist between our InI-OST-HMD system and a conventional InI-based visualization method, which has been investigated primarily for eyewear-free 3D displays [7–13]. First, a microdisplay with a large pixel count and very fine pixels (~5–10 μm pixel size) is used in a micro-InI system instead of the large-pixel display devices (~200–500 μm pixel size) used in conventional InI displays, offering a significant gain in spatial resolution. Second, owing to the nature of HMD systems, the viewing zone is well confined, and therefore a much smaller view angle is adequate to generate the full-parallax lightfields for this confined zone than is required for large-size autostereoscopic displays. Third, in a conventional InI-based display system it is very challenging to visualize a 3D scene with both a large field of view and a large depth of field, due to the limited imaging capability and finite aperture of the MLAs, the poor spatial resolution of large-size displays, and the trade-off between a wide view angle and high lateral and longitudinal resolution. In an InI-HMD system, thanks to the magnification of the HMD viewing optics, a very narrow depth range (e.g. ~3.5 mm) for the intermediate 3D scene reconstructed by the micro-InI unit is adequate to produce a perceived 3D volume spanning a large depth range (e.g. 40 cm to 5 m), which is much more technically affordable than in a conventional stand-alone InI display, where a depth range of at least 50 cm would be required to be usable; the first-order sketch below illustrates this depth compression. Finally, by optimizing the microlenses and the HMD viewing optics together, the depth resolution of the overall InI-HMD system can be substantially improved, overcoming the imaging limits of a stand-alone InI system.
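The compression of perceived depth into intermediate depth follows from the longitudinal magnification of the eyepiece. The sketch below is a thin-lens approximation using the 28 mm focal length quoted in Section 3, not the authors' design data; the actual freeform prism is not a thin lens, so the numbers only indicate the order of magnitude.

```python
# First-order sketch (thin-lens approximation): how many millimeters
# of intermediate depth the micro-InI unit must reconstruct for the
# eyepiece to span a perceived depth range of 40 cm to 5 m.

def object_distance_mm(f_mm, virtual_image_mm):
    """Magnifier conjugate: for a virtual image at distance V in front
    of the lens, the object sits at 1 / (1/f + 1/V), inside the focal
    length (all distances in mm, measured from the lens)."""
    return 1.0 / (1.0 / f_mm + 1.0 / virtual_image_mm)

f_eyepiece = 28.0                               # prototype eyepiece focal length, mm
u_near = object_distance_mm(f_eyepiece, 400.0)  # 40 cm perceived depth
u_far = object_distance_mm(f_eyepiece, 5000.0)  # 5 m perceived depth
print(f"intermediate depth span ~ {u_far - u_near:.1f} mm")
# -> ~1.7 mm for an ideal thin lens, the same order of magnitude as
#    the ~3.5 mm quoted for the actual freeform eyepiece. Millimeters
#    of intermediate depth replace the tens of centimeters a
#    stand-alone InI display would need.
```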


3. System prototype

Based on the schematics in Fig. 2, we implemented a proof-of-concept monocular prototype of an InI OST-HMD using off-the-shelf optical components (Fig. 3(a)). An MLA with a 3.3 mm focal length and 0.985 mm pitch was utilized. The microdisplay is a 0.8″ organic light-emitting display offering 1920×1200 color pixels with a pixel size of 9.6 μm. A freeform eyepiece with an equivalent focal length of 28 mm, along with a see-through compensator, was utilized. Although the freeform eyepiece offers a 40-degree field of view, its combination with the micro-InI unit yields a field of view of approximately 33.4 degrees. Due to the limits of the original freeform eyepiece, the InI OST-HMD system offers an exit pupil approximately 6.5 mm in diameter, within which the 3D InI view can be observed. The distance from the system exit pupil to surface 1 of the freeform eyepiece is 19 mm.

For demonstration purposes, a 3D scene consisting of the number “3” in orange and the letter “D” in blue was simulated. In the visual space, the objects “3” and “D” are located ~4 m and 30 cm away from the eye position, respectively. To clearly demonstrate the effects of focusing, these character objects were rendered with black line textures instead of plain solid colors. Their dimensions were chosen to maintain approximately the same angular size in the visual space; as a result, the objects appear approximately the same size in images captured through a camera, despite their large depth separation. An array of 18×11 elemental images of the 3D scene was simulated, each consisting of 102×102 color pixels. Due to the limits of the existing eyepiece, only the central 12×11 elemental images were used for the prototype demonstration (Fig. 3(b)). The 3D scene reconstructed by the micro-InI unit is approximately 10 mm away from the MLA, and the separation of the two reconstructed targets is approximately 3.5 mm in depth in the intermediate reconstruction space.

Figures 3(c) through 3(f) show a set of images captured with a digital camera placed at the eye position. To demonstrate the effects of focus and the see-through view, a Snellen letter chart and a printed black-and-white grating target were placed in the real-world view ~4 m and 30 cm away from the viewer, respectively, corresponding to the locations of the objects “3” and “D”. Figures 3(c) and 3(d) demonstrate the effects of focusing the camera on the Snellen chart and on the grating target, respectively: the object “3” appears in sharp focus when the camera is focused on the far Snellen chart, while the object “D” is in focus when the camera is focused on the near grating target. Figures 3(e) and 3(f) demonstrate the effects of shifting the camera from the left to the right side of the eyebox while the camera focus remained on the near grating target; as expected, a slight perspective change is observed between the two views. Although artifacts are admittedly visible and further development is needed, the results clearly demonstrate that the proposed method can produce correct focus cues and true 3D viewing over a large depth range in an AR display.
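As a sanity check, the prototype's quoted first-order numbers can be reproduced from the stated components with thin-lens relations. The sketch below is our own cross-check, not the authors' design code; the 4.9 mm display-to-MLA gap is an assumed value inferred from the reported 10 mm reconstruction distance.

```python
# First-order cross-check of the prototype parameters (thin-lens
# approximations; the freeform prism is not a thin lens, so small
# residuals against the quoted values are expected).
import math

def diag_fov_deg(width_mm, height_mm, f_mm):
    """Diagonal field of view of a display of the given size viewed
    through an eyepiece of focal length f_mm: 2*atan(d / (2*f))."""
    d = math.hypot(width_mm, height_mm)
    return math.degrees(2.0 * math.atan(d / (2.0 * f_mm)))

px = 9.6e-3  # microdisplay pixel pitch, mm

# Full 1920x1200 panel behind the 28 mm eyepiece alone:
print(f"eyepiece FOV  ~ {diag_fov_deg(1920 * px, 1200 * px, 28.0):.1f} deg")          # ~42 vs. ~40 quoted
# Active micro-InI area: central 12x11 elemental images, 102x102 px each:
print(f"micro-InI FOV ~ {diag_fov_deg(12 * 102 * px, 11 * 102 * px, 28.0):.1f} deg")  # ~32 vs. 33.4 quoted
# Reconstruction distance: conjugate of the microdisplay through the
# 3.3 mm MLA, with an assumed 4.9 mm display-to-MLA gap:
f_mla, gap = 3.3, 4.9
print(f"reconstruction ~ {1.0 / (1.0 / f_mla - 1.0 / gap):.1f} mm from MLA")          # ~10 mm, as reported
```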


Fig. 3. Prototype demonstration and experimental results for the proposed InI OST-HMD system: (a) Setup; (b) Elemental images; (c) and (d) Reconstructed image captured by the camera focused at 4 m and 30 cm, respectively; (e) and (f) Reconstructed image captured with the camera shifted to the left and right sides of the exit pupil, respectively.

4. Conclusion

In conclusion, we have described a novel method for designing an OST-HMD that uniquely integrates recent advances in freeform optical technology with the microscopic integral imaging (micro-InI) method. This new method can potentially lead to wearable AR displays that are less vulnerable to the accommodation-convergence discrepancy problem and the visual discomfort found in existing OST-HMD systems. The optical principles of the proposed method were presented, and a proof-of-concept prototype offering a 40-degree field of view and a depth range of over 4 meters was demonstrated. In future work, we will perform analytical design and optimization of the InI-HMD optical system with the goal of achieving a wide field of view, high lateral and longitudinal resolution, and a large depth of field.

Acknowledgments

This work is partially funded by National Science Foundation grant awards 1115489 and 0915035. The authors would like to thank Dr. Sangyoon Lee and Dr. Jingang Wang for their contributions in preparing the elemental images for the experiments and Mr. Xinda Hu for assisting with the experiments.
