eYs3D Microelectronics Raises $7 Million Series A from Industry Strategic Investors for Vision/AI Chips
May 26, 2021
eYs3D Microelectronics announced it has raised $7 million in Series A funding from strategic partners ARM IoT Capital, WI Harper Group, and MARUBUN CORPORATION.
The funding will allow eYs3D to expand new product development for AI-based autonomous operation in areas including robotics, security, touchless control, autonomous vehicles, and smart retail.
The company demonstrated its 3D depth-sensing cameras and vision sensor integration solutions on May 25 at the virtual 2021 Embedded Vision Summit as a member of the Edge AI and Vision Alliance, showcasing sensor use cases designed to enable new capabilities in object recognition, distance measurement, and more.
eYs3D was spun off from Etron Technology in 2016 and has since supplied the market with ICs and other sensory products. Per the company, the new investment will allow eYs3D to build out its embedded chip business in additional markets and bring the startup to the next level. The company also says ARM IoT Capital will work with eYs3D on integrating its chips with ARM’s CPU/NPU processors, WI Harper Group will give eYs3D access to its base of industrial partners and its ecosystem, and MARUBUN CORPORATION joins the round to open new distribution channels for deploying eYs3D’s solutions to businesses worldwide.
Computer vision is critical to enabling autonomous functionality in software and machines, from robotic spatial awareness to scene understanding on edge devices. According to Meticulous Research, the 3D and machine vision market is expected to nearly double from $1.35 billion in 2020 to $2.65 billion in 2027, a compound annual growth rate of roughly 10.2%.
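Those growth figures are easy to sanity-check. The sketch below recomputes the compound annual growth rate from the two cited market sizes; it assumes 2020 as the base year and a seven-year horizon to 2027 (the exact convention Meticulous Research used is not stated).

```python
# Sanity check of the cited market figures (illustrative only; assumes a
# 7-year horizon from 2020 to 2027).
start_usd_b = 1.35   # 2020 market size, $ billions (as cited)
end_usd_b = 2.65     # 2027 forecast, $ billions (as cited)
years = 7

cagr = (end_usd_b / start_usd_b) ** (1 / years) - 1
print(f"CAGR ≈ {cagr:.1%}")  # ~10.1%, in line with the cited ~10.2%
```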
eYs3D designs processor ICs for AI at the edge, aimed at enabling close human/machine coordination. To address growing markets such as the artificial intelligence of things (AIoT) and mobile intelligence, eYs3D pairs its silicon design with algorithms that integrate and manage information from disparate sensor sources, including thermal, active 3D sensing, and neural network perception. Per the company, this “sensor fusion” enables the design of systems for applications incorporating visual simultaneous localization and mapping (VSLAM), object feature depth recognition, and gesture-based commands.
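To make the sensor-fusion idea concrete, here is a purely illustrative sketch of a simple late-fusion step that combines a depth map and a thermal frame into one per-pixel score that a downstream task (a VSLAM front end, a gesture recognizer, and so on) might consume. This is not eYs3D's actual pipeline; the function name, weights, and array shapes are all hypothetical.

```python
# Illustrative late fusion of a depth map and a thermal frame into a single
# per-pixel saliency map. Hypothetical example, not eYs3D's implementation.
import numpy as np

def fuse_depth_thermal(depth_m: np.ndarray, thermal_c: np.ndarray,
                       w_depth: float = 0.6, w_thermal: float = 0.4) -> np.ndarray:
    """Combine normalized depth and thermal cues into one saliency map."""
    # Normalize each modality to [0, 1]; nearer objects and warmer pixels score higher.
    depth_score = 1.0 - (depth_m - depth_m.min()) / (np.ptp(depth_m) + 1e-6)
    thermal_score = (thermal_c - thermal_c.min()) / (np.ptp(thermal_c) + 1e-6)
    # Weighted sum: a stand-in for the fusion step feeding a downstream detector.
    return w_depth * depth_score + w_thermal * thermal_score

if __name__ == "__main__":
    depth = np.random.uniform(0.3, 5.0, size=(240, 320))      # meters, synthetic
    thermal = np.random.uniform(18.0, 37.0, size=(240, 320))  # deg C, synthetic
    saliency = fuse_depth_thermal(depth, thermal)
    print(saliency.shape, float(saliency.min()), float(saliency.max()))
```

In practice the fusion would run on dedicated silicon and involve calibration, alignment, and learned models rather than a fixed weighted sum; the sketch only shows the shape of the idea.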
The funds will be used for further product development, staff expansion, and marketing.