Embodied Labs Raises $3.2 Million in Seed Funding For VR Healthcare Training With Aging-Related Illnesses

Embodied Labs Raises $3.2 Million in Seed Funding With Diverse Investors Across Aging, Tech and Impact Funds

Company’s Unique Immersive Platform Will Be Showcased at Health 2.0 VentureConnect Event During J.P. Morgan Healthcare Conference

California-based Embodied Labs, the leading immersive platform using virtual reality (VR) to revolutionize training for the aging care workforce, announced $3.2 million in seed funding. The round was led by top age-tech investor Ziegler Link·Age Fund and co-led by leading immersive tech investor The Venture Reality Fund, with participation from SustainVC, a social impact fund in healthcare and education; WXR Fund, which invests in women and the next wave of computing; and ETF@JFFLabs, a social impact fund that invests in technologies that close skill gaps and improve economic mobility.

“In addition to being oversubscribed in our initial investment round, we are thrilled with our unique mix of investors who have proven expertise across the convergence of aging, emerging technology, social impact, female empowerment and the need to transform our workforce training methods in health and aging care,” said Carrie Shaw, founder and CEO of Embodied Labs.

Founded in 2016, Embodied Labs uses the immersive experience of VR to put health care professionals and family caregivers into the body and mind of those who are challenged with lifespan aging issues: cognitive decline such as Alzheimer’s, age-related vision and hearing loss, neurodegenerative disease such as Parkinson’s and Lewy body dementia, and end-of-life decisions. Its training mission is twofold: enhance person-centered care through intellectual and instinctual behavior change and help long-term care providers recruit and retain a quality caregiver workforce.


Carrie Shaw, CEO of Embodied Labs, demos VR training platform for aging care providers

The overall age-tech market is projected to reach $40 billion, and according to a Goldman Sachs report, VR in U.S. health care will capture 12.5 percent of that market by 2025. From a workforce training perspective, ABI Research found that VR-based enterprise training had a 75 percent learning retention rate, versus 10 percent for lecture-based and 15 percent for reading-based training.

“One of the target attributes we look for in our investment portfolio companies is a differentiated solution with broad market potential,” said John Hopper, chief investment officer, Ziegler Link·age Fund. “We are leading this investment round for Embodied Labs because it delivers a unique solution across our limited partners’ spectrum of senior living, home care, hospice and hospitals to train a workforce that spans four generations. Using this innovative VR tool to attract and retain valuable talent puts providers ahead of the curve operationally and with their customers.”

“Embodied Labs sits at the intersection of immersive tech and age-tech innovation, positioned to establish its leadership in this huge market opportunity,” said Marco DeMiroz, co-founder and general partner of The Venture Reality Fund. “What attracted us to invest in Embodied Labs is its unique focus on VR to deliver transformative training in a turnkey, tech-sophisticated package built on its comprehensive and innovative platform, which is already seeing significant adoption by a traditionally tech-phobic long-term care industry.”

“VR technology offers a quantum leap in delivering empathy and understanding of our ever-growing aging population,” said Dr. Ken Dychtwald, aging visionary, author, co-founder of AgeWave and an angel investor in Embodied Labs. “I became an early seed funder in Carrie Shaw’s Embodied Labs because they’re committed to using cutting edge technology to provide better care for our aging population.”

“Female-driven innovation and influence, especially in age-tech, is exciting to see and I immediately embraced Carrie’s vision to use VR to create emotional intelligence in aging for better person-centered care,” said Maddy Dychtwald, author, co-founder of AgeWave and a founder of WomenAgainstAlzheimer’s, who is also an angel investor in Embodied Labs.

Embodied Labs Showcased During J.P. Morgan Healthcare Week
Embodied Labs has been selected as one of only 12 finalists to present at the prestigious Health 2.0 VentureConnect event held during the J.P. Morgan Healthcare Conference in San Francisco the week of January 13. The event gathers highly vetted startups and venture capitalists in an exclusively curated showcase of the convergence of health care and technology.

About Embodied Labs

Embodied Labs, headquartered in Los Angeles, is the leading immersive VR learning platform for aging population health issues, training home care, senior living, hospice, medical and nursing schools, hospitals, and employers interested in educating HR departments to better support their caregiving employees. Working collaboratively with health care, gerontology, medical and clinical science experts, as well as seasoned Hollywood filmmakers for powerful storytelling, the Embodied Labs solution has won the National AARP Caregiving Innovation Challenge, The United Healthcare & AARP OpenIDEO Caregiving for Dementia Challenge, The US Department of Education EdSim Challenge and The GlobalXR in Education Prize Challenge funded by the Bill and Melinda Gates Foundation.

Verizon Develops 5G Edge Technology For VR, AR and MR

Verizon recently built and tested an independent GPU-based orchestration system and developed enterprise mobility capabilities for virtual reality (VR), augmented reality (AR), mixed reality (MR), and cinematic reality (CR). Together, these capabilities could pave the way for a new class of affordable mobile cloud services, provide a platform for developing ultra low-latency cloud gaming, and enable the development of scalable GPU cloud-based services.

GPU-based orchestration system

5G technology and Verizon’s Intelligent Edge Network are designed to provide real-time cloud services at the edge of the network, nearest to the customer. Many of the applications that would benefit from this technology are imaging- and graphics-intensive and therefore run significantly better on a GPU. Artificial intelligence and machine learning (AI/ML), augmented, virtual and mixed reality (AR/VR/MR), AAA gaming, and real-time enterprise applications are highly dependent on GPUs for compute capability. The limited availability of efficient GPU resource management is a barrier to the scalable deployment of such technologies.

To meet this need, the Verizon team developed a prototype that uses GPU slicing and virtualization management to support any GPU-based service and increase capacity for multiple user loads and tenants.
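The idea behind GPU slicing can be illustrated with a toy admission-control allocator: a physical GPU is carved into tenant slices, and a new workload is admitted only if both its memory and compute shares fit in what remains. This is a minimal sketch for illustration only; all class names, resource figures, and the admission policy are hypothetical, not details of Verizon's prototype.

```python
from dataclasses import dataclass, field

@dataclass
class GpuSlice:
    tenant: str
    memory_mb: int
    compute_pct: int

@dataclass
class GpuNode:
    """One physical GPU on an edge node, carved into tenant slices."""
    total_memory_mb: int = 16384
    total_compute_pct: int = 100
    slices: list = field(default_factory=list)

    def free_memory(self) -> int:
        return self.total_memory_mb - sum(s.memory_mb for s in self.slices)

    def free_compute(self) -> int:
        return self.total_compute_pct - sum(s.compute_pct for s in self.slices)

    def admit(self, tenant: str, memory_mb: int, compute_pct: int) -> bool:
        """Admit a tenant workload only if both resources still fit."""
        if memory_mb <= self.free_memory() and compute_pct <= self.free_compute():
            self.slices.append(GpuSlice(tenant, memory_mb, compute_pct))
            return True
        return False

node = GpuNode()
node.admit("vision-service", 4096, 25)   # fits: 4 GB, 25% compute
node.admit("gaming-service", 8192, 50)   # fits: 8 GB, 50% compute
node.admit("render-farm", 8192, 50)      # rejected: only 4 GB memory left
```

A real orchestrator would also handle preemption, isolation, and scheduling across many nodes; the point here is only that per-tenant accounting is what lets one GPU safely serve multiple user loads.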

Verizon engineers successfully tested the new GPU orchestration technology, in combination with edge services, in proof-of-concept trials on a live network in Houston, TX. In one test of computer vision as a service, the new orchestration supported eight times the number of concurrent users; for a graphics gaming service, it supported over 80 times the number of concurrent users.

“Creating a scalable, low cost, edge-based GPU compute [capability] is the gateway to the production of inexpensive, highly powerful mobile devices,” said Nicki Palmer, Chief Product Development Officer. “This new innovation will lead to more cost effective and user friendly mobile mixed reality devices and open up a new world of possibilities for developers, consumers and enterprises.”

Edge application capabilities

To assist developers in creating these new applications and solutions, Verizon’s team developed a suite of edge capabilities. These capabilities, similar in nature to APIs (application programming interfaces), describe processes that developers can use to build an application without the need for additional code. This eases the burden on developers and creates more consistency across apps. Building on this technology, the team has created eight services for developers to use when creating applications and solutions on 5G Edge technology:

  1. 2D Computer vision – Users provide 2D images that a device can recognize and track. (Examples: A consumer may view a poster through glasses and 2D computer vision could be used to make it come alive; a movie poster could automatically play a trailer; a product box could show an overlay of nutritional info, coupons, etc.)
  2. XR lighting – Currently when 3D objects are inserted into the real world they appear as 2D and can seem out of place. XR lighting can send back environment lighting and video info to reproduce a scene reflecting accurate lighting, shadows, roughness, reflections, and metallics on the 3D object so that it blends perfectly into the environment around it.
  3. Split rendering – Split rendering enables the delivery of PC/console level graphics to any mobile device. Split rendering splits the graphics processing workload of games, 3D models, or other complex graphics and pushes the most difficult calculations onto the server allowing the lighter calculations to remain on the device.
  4. Real time ray tracing – Traditional 3D scenes or objects are designed using complex custom algorithms to extrapolate or calculate how light will fall and how ending colors will look and feel. With real-time ray tracing capability, each pixel can receive accurate light coloring, greatly advancing the realism of 3D.
  5. Spatial audio – In the ongoing evolution of sound (from mono to stereo to 7.1 surround sound) spatial audio is the next step. This type of audio processing is extremely processor heavy. In designing for spatialized audio, objects in a 3D scene must react with the sound so that users have a true sense of the space and relative location of an object in an augmented reality environment. As audio is emitted it bounces off digital objects and based on directionality and where your head location is, reproduces what you would hear in the real world.
  6. 3D Computer vision – 3D computer vision enables 3D object recognition by training the edge network to understand what a 3D object is from all angles. (Examples: A football helmet could provide highlights, stats, etc. for players seen from any angle. At a grocery store, 3D computer vision would allow a consumer using AR to recognize and respond to objects with unusual 3D shapes, such as fruits and vegetables.)
  7. Real time transcoding – Transcoding converts a file from one format to another; footage is usually much larger in its raw format. Real time transcoding handles this conversion on the fly, so a consumer doesn’t have to worry about what file format goes up and what comes down. It is a content creation tool that saves time and optimizes workflows.
  8. Asset caching – Asset caching provides for real time use of assets on the edge, allowing people to work collaboratively. (Example: Multiple people could work on a video file in real time together, instead of handing it off to avoid overwriting each other’s work.) The fast file format for 5G and this caching optimization tool allow a virtually limitless number of people to work on the same file in real time.
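Split rendering (item 3 above) comes down to partitioning a frame's workload by cost: expensive passes go to the edge server, cheap ones stay on the device. The sketch below illustrates that routing decision only; the task names, costs, and threshold are hypothetical, not part of Verizon's actual service.

```python
# Per-task device budget in milliseconds (hypothetical figure, chosen so a
# handful of light tasks still fit inside a ~16 ms frame at 60 fps).
HEAVY_THRESHOLD_MS = 8.0

def split_render(tasks):
    """Partition frame tasks between the device and the edge server.

    `tasks` is a list of (name, estimated_cost_ms) pairs; anything over the
    threshold is pushed to the server, the rest stays on the device.
    """
    device, edge = [], []
    for name, cost_ms in tasks:
        (edge if cost_ms > HEAVY_THRESHOLD_MS else device).append(name)
    return device, edge

frame_tasks = [
    ("ui_overlay", 1.5),
    ("shadow_pass", 12.0),
    ("global_illumination", 25.0),
    ("hud_text", 0.5),
]
device, edge = split_render(frame_tasks)
# device -> ["ui_overlay", "hud_text"]
# edge   -> ["shadow_pass", "global_illumination"]
```

In practice the split is decided by profiling and network latency rather than a fixed threshold, but the shape of the decision is the same: keep latency-critical, lightweight work local and stream the heavy results back from the edge.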

With the development of these innovative technologies and more than a dozen patents pending, the Verizon team was awarded “Biggest Contribution to Edge Computing R&D” at the International London Edge Conference. Some of this technology can now be seen in person: at Mobile World Congress Americas in LA next week, Verizon will demonstrate several of these new capabilities at the Verizon 5G Built Right booth. The demos include:

  • Reimagined workspace with augmented reality: 1000 Augmented Realities + AR smart glasses map the workspace and render overlays and information displays.
  • Qwake Tech: AR helmet attachment allows firefighters to see in low visibility environments.
  • 3D handheld scanner: Volumetric scanner creates detailed renderings of objects and entire scenes.
  • Verizon AR Shopping: Instantly and seamlessly overlays digital displays on top of physical products.
  • Visceral Science: Educational VR experiences covering essential science concepts (e.g., the lifecycle of a star), aligned with middle and high school curricula.

“The future is now. We’re no longer simply talking about the possibilities of 5G and edge computing,” said Palmer. “The work our Verizon 5G Lab team is doing is pushing the envelope of innovation and leading our industry into a new day where the possibilities inherent in 5G technology are becoming reality.”

Apple Acquires UK Motion Capture Company iKinema

Documentation uncovered by MacRumors shows that an Apple executive joined iKinema’s board of directors on September 9, 2019. Around the same time, the company’s official address was changed to that of Apple’s law firm in London. On October 4th, Apple officially confirmed the acquisition; further details were not disclosed.

What We Think

iKinema’s technology has been used on many films and games, and it will be interesting to see what plans Apple has for it. The iKinema website has been taken down entirely, but in previous press releases the company described itself and its client list as follows:

  • IKINEMA (www.IKINEMA.com) sells products that dramatically improve the quality of animation and reduce the cost of producing animation. It uses patented IP to dynamically calculate animation sequences. The resulting animations are more realistic and cheaper to produce and maintain. IKINEMA was established in 2006 in the UK and has actively traded since 2009. The company owns patent protected intellectual property for fast, realistic and organic animation for games, virtual, and movie production. Prominent clients include: Activision, Bethesda, Bluehole Inc., Capcom, CBS Digital, Deep Silver Dambuster Studios, Digital Domain, Digital Idea Corp., Disney, DreamWorks Animation, Electronic Arts, Epic Games, Foundry, Fox VFX, Framestore, Globo TV, GREE, Inc., Hasbro, Image Engine Design, Inc., Impulse Gear, Infinity Ward, Koch Media, Linden Lab, Microsoft Studios, NASA Johnson Space Center, NCSOFT, NVIDIA, Quantic Dream, Rare, Respawn Entertainment, Snail Games, Softstar Entertainment Inc., Square Enix, Sumo Digital, Technicolor, Tencent NEXT Studio, The VOID, 20th Century Fox, 2K Games, Ubisoft, Unity3D, Valve Software, Vicon, Warner Bros. and many more. The company is licensed technology provider by Microsoft, Sony, Nintendo and Autodesk. All products or brand names mentioned are trademarks or registered trademarks of their respective holders.

The ability to produce high-quality, low-cost animation is a problem the industry has been trying to solve for years; 3D and animation remain the most difficult forms of media to create. Tools that help democratize 3D content creation for developers are definitely a valuable resource, and Apple appears to have targeted that as something to capitalize on in the future.

Jaunt XR’s Technology Acquired by Verizon

Jaunt XR today announced the acquisition of its software, technology, and certain other assets by Verizon Communications Inc. (NYSE, Nasdaq: VZ) for an undisclosed amount. Jaunt XR is a leader in the immersive industry with a focus on the scalable creation and distribution of volumetric video of humans.

“We are thrilled with Verizon’s acquisition of Jaunt’s technology,” said Mitzi Reaugh, President & CEO of Jaunt XR. “The Jaunt team has built leading-edge software and we are excited for its next chapter with Verizon.”

Jaunt will be assisting Verizon with the transition of select portions of the software and technology for a brief period.

About Jaunt XR

Jaunt, Inc. (dba Jaunt XR) enables the scaled creation and distribution of volumetric video through machine learning. Jaunt has twice been named to Fast Company’s World’s Most Innovative Companies list and has made back-to-back appearances on CNBC’s Disruptor 50 List. The company is headquartered in San Mateo, California.

Jaunt’s investors include Evolution Media Partners, CMC, Highland Capital Partners, Redpoint Ventures, SMG, Axel Springer, The Walt Disney Company, The Madison Square Garden Company, and Sky. Learn more at www.jauntxr.com.