Session: 16-01-01: Government Agency Student Poster Competition
Paper Number: 150336
150336 - 3D Reconstruction of Millet Plants Using Neural Radiance Fields
3D replicas, or digital twins, of real-world biological systems are becoming increasingly popular with the advancement of 3D capture technologies. These 3D reconstructions can be visualized in virtual reality (VR) to create an immersive environment that enables lifelike interactions. Although VR is a mature technology in engineering design and gaming, its application in precision agriculture remains limited. Using VR in agriculture can open new opportunities for timely data visualization and smarter crop-breeding strategies. In this work, we explore the 3D reconstruction and visualization of millet plants at different stages of their growth, using Neural Radiance Fields (NeRFs) to reconstruct 3D point clouds of the plants.
We used an iPhone 14 Pro to capture images and videos of millet plants in a greenhouse, reconstructed a detailed 3D point cloud from these captures using NeRFs, and rendered the result in a VR headset. This approach enables researchers to observe plant growth and development without being physically present, addressing the challenge of documenting rapidly growing plants. It is also a practical way to exhibit and document plants in 3D, providing a more comprehensive view than traditional 2D imaging methods. Using a phone is a cost-effective strategy that eliminates the need for expensive LiDAR devices and significantly reduces the time and effort required for data collection.
We employ Neural Radiance Fields (NeRFs) to generate detailed 3D point clouds of millet plants. A NeRF is a representation of a 3D scene, learned from images captured at different viewpoints, that can render realistic views of the scene from novel viewpoints. To generate comprehensive 3D models, 360-degree photos and videos were captured from multiple heights by placing the plants on a turntable. Data were captured using both the phone's built-in back camera and the Polycam application. The captured images were then processed with COLMAP, an open-source photogrammetry tool, to extract the camera pose information; Polycam instead leverages the iPhone's built-in accelerometer and gyroscope to estimate the 3D poses. The image and pose data were then input into the NeRF training method 'NeRFacto' to reconstruct the scene, and the 3D model of the plant was extracted from the reconstructed scene as a point cloud. The point cloud was then cleaned and color-corrected using CloudCompare. The results indicate that while both the traditional video and Polycam workflows produce high-quality models, Polycam performed better on bushier plants, such as the finger millet genotype, owing to better pose estimation than COLMAP.
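The abstract describes this pipeline only at a high level. The following is a minimal sketch, written in Python around the Nerfstudio command-line tools (which provide the 'NeRFacto' method), of how such a capture-to-point-cloud workflow could be scripted; the file paths and flag values are illustrative assumptions, and the exact options vary between Nerfstudio versions.

```python
"""Sketch of a capture-to-point-cloud workflow using the Nerfstudio CLI.

Assumes Nerfstudio is installed; all paths below are placeholders.
"""
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Recover camera poses. For plain video or image input, Nerfstudio runs
#    COLMAP under the hood; a Polycam export could instead be processed
#    with `ns-process-data polycam`.
run(["ns-process-data", "video",
     "--data", "capture/millet_turntable.mp4",
     "--output-dir", "processed/millet"])

# 2. Train the 'nerfacto' model on the posed images.
run(["ns-train", "nerfacto", "--data", "processed/millet"])

# 3. Export the reconstructed scene as a point cloud, which can then be
#    cleaned and color-corrected (e.g., in CloudCompare).
config = "outputs/millet/nerfacto/2024-01-01_000000/config.yml"  # path printed by ns-train
run(["ns-export", "pointcloud",
     "--load-config", config,
     "--output-dir", "exports/millet",
     "--num-points", "1000000"])
```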
We then used Unreal Engine to develop a virtual greenhouse environment that showcases the growth stages of different millet genotypes grown in a controlled environment. In this virtual world, users wearing headsets can roam freely, inspect the plants for abnormalities or diseases, and learn more about them without being physically present. With the lifelike scenes produced by NeRF reconstruction and the immersive nature of VR, this method is an excellent way to document growth stages such as leaf expansion, flowering, and maturity. The capture and processing pipeline is efficient and cost-effective, requiring only a means of capturing images and a GPU-equipped PC to train the NeRF, which makes it accessible to small-scale farmers and researchers, promotes widespread adoption, and facilitates advancements in precision agriculture.
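As a purely illustrative, scriptable counterpart to the CloudCompare cleanup step mentioned above (not the authors' workflow), a similar cleanup of the exported point cloud could be done with Open3D before the model is turned into an asset for the virtual greenhouse; all file names below are placeholders.

```python
"""Illustrative point-cloud cleanup with Open3D, a scriptable stand-in for
the CloudCompare step described above. File names are placeholders."""
import open3d as o3d

# Load the point cloud exported from the NeRF reconstruction.
pcd = o3d.io.read_point_cloud("exports/millet/point_cloud.ply")

# Remove sparse outliers left over from the reconstruction.
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Optionally thin the cloud so it stays light enough for real-time VR rendering.
pcd = pcd.voxel_down_sample(voxel_size=0.002)

# Save the cleaned cloud for conversion into a game-engine asset.
o3d.io.write_point_cloud("exports/millet/point_cloud_clean.ply", pcd)
```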
Presenting Author: Shambhavi Joshi Iowa State University
Presenting Author Biography: Shambhavi Joshi is a Ph.D. candidate in the Department of Mechanical Engineering at Iowa State University. She holds a Bachelor's degree in Mechanical Engineering from the same institution. Her research focuses on using virtual reality (VR) and 3D imaging technologies in precision agriculture, specifically for monitoring and analyzing plant growth.
Shambhavi works with 360-degree imaging and video to create detailed and immersive 3D models of millet plants. Her goal is to develop cost-effective and efficient methods for data collection and analysis that can benefit small-scale farmers and researchers.
Authors:
Shambhavi Joshi Iowa State University
Mozhgan Hadadi Iowa State University
Juan I. Di Salvo Iowa State University
Asheesh K Singh Iowa State University
Adarsh Krishnamurthy Iowa State University
3D Reconstruction of Millet Plants Using Neural Radiance Fields
Paper Type
Government Agency Student Poster Presentation