Research Article
Methods and Strategies for 3D Content Creation Based on 3D Native Methods
Shun Fang, Xing Feng, Yanna Lv
EAI Endorsed Transactions on AI and Robotics (AIRO), vol. 3, no. 1, EAI, December 2024
Keywords: 3D Content Creation, Point-E, 3DGen, Shap-E, 3D Generation
DOI: 10.4108/airo.5320
Abstract
This paper provides a comprehensive overview of three neural network models, namely Point·E, 3DGen, and Shap·E, focusing on their overall pipelines, network architectures, and loss functions, as well as their strengths, weaknesses, and potential directions for future research. Point·E, an efficient framework, generates 3D point clouds from complex text prompts by leveraging a text-to-image diffusion model followed by an image-conditioned point cloud diffusion model. 3DGen, a novel architecture, integrates a Variational Autoencoder with a diffusion model to produce triplane features for conditional and unconditional 3D object generation. Shap·E, a conditional generative model, directly generates the parameters of implicit functions, enabling the creation of textured meshes and neural radiance fields. While these models demonstrate significant advances in 3D generation, areas for improvement include enhancing sample quality, optimizing computational efficiency, and handling more complex scenes. Future research could explore further integration of these models with other techniques and extend their capabilities to address these challenges.
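To make the cascaded, two-stage design described for Point·E concrete, the following is a minimal PyTorch sketch of a text-to-3D cascade: a first stage standing in for a text-to-image diffusion model, and a second stage standing in for an image-conditioned point cloud generator. All class names, dimensions, and the simplified single-pass "stages" are illustrative assumptions, not the implementation used by the reviewed models.

```python
# Minimal sketch of a two-stage text-to-3D cascade in the spirit of Point-E.
# The class names, feature dimensions, and single-pass stages are hypothetical
# placeholders for the actual diffusion models.
import torch
import torch.nn as nn


class TextToImageStage(nn.Module):
    """Stand-in for a text-to-image diffusion model: maps a text embedding
    to a synthetic image feature (a flat vector instead of real pixels)."""
    def __init__(self, text_dim=512, image_dim=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(text_dim, image_dim),
            nn.GELU(),
            nn.Linear(image_dim, image_dim),
        )

    def forward(self, text_emb):
        return self.net(text_emb)


class ImageToPointCloudStage(nn.Module):
    """Stand-in for an image-conditioned point cloud diffusion model:
    maps the image feature to N points with XYZ coordinates and RGB colors."""
    def __init__(self, image_dim=1024, num_points=1024):
        super().__init__()
        self.num_points = num_points
        self.net = nn.Linear(image_dim, num_points * 6)  # 3 coords + 3 colors

    def forward(self, image_feat):
        out = self.net(image_feat)
        return out.view(-1, self.num_points, 6)


# Usage: a random "text embedding" stands in for the output of a text encoder.
text_emb = torch.randn(1, 512)
image_feat = TextToImageStage()(text_emb)            # stage 1: text -> image feature
point_cloud = ImageToPointCloudStage()(image_feat)   # stage 2: image -> point cloud
print(point_cloud.shape)  # torch.Size([1, 1024, 6])
```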
Copyright © 2024 Fang et al., licensed to EAI. This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium so long as the original work is properly cited.