
Generative data performance

Open yic03685 opened this issue 1 year ago • 6 comments

First of all, great project with incredible potential. I have questions regarding the generative data. In the docs, you mention that the IK simulation runs faster than real time. How about the generation itself? For example, when generating character motion, what was the performance? Also, is the simulation of the character motion after generation faster than real time as well?

yic03685 avatar Dec 22 '24 20:12 yic03685

It varies across modalities, but generation and simulation are generally fast. The biggest bottleneck is photo-realistic rendering, since we are using a pure ray tracer for now, and this is one of the main reasons we are not ready to open-source the generative part. We are assembling an engineering team to build a new rendering framework (ray-tracer-augmented rasterization) that achieves similar visual quality at much higher speed.

zhouxian avatar Dec 22 '24 20:12 zhouxian

Thanks for the quick response. Does this mean that, aside from photo-realistic rendering, the generation and simulation of the underlying geometry are close to real time?

yic03685 avatar Dec 22 '24 20:12 yic03685

As I said, it depends on the actual scene and modality:

  • the beer bottle demo: faster than real time
  • a scene with ~20,000 particles: roughly real time
  • a scene with 10M particles: much slower than real time (but this can be accelerated using a different representation)
  • motion generation: should be fast, since the motion is generated directly by a neural module, but I'll have to double-check
  • interactive scene generation: everything is generated at once, and takes a few minutes, I think
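
For anyone wanting to benchmark this themselves, "faster than real time" can be quantified as a real-time factor: simulated time divided by wall-clock time. Below is a minimal, engine-agnostic sketch; the `step` callable and `dt` value are placeholders standing in for a real simulator's step function and time step, not Genesis API:

```python
import time

def real_time_factor(step, dt, n_steps=1000):
    """Measure how fast a simulation runs relative to wall-clock time.

    step    -- callable that advances the simulation by one time step
    dt      -- simulated seconds per step
    n_steps -- number of steps to time

    Returns simulated_time / wall_time; > 1.0 means faster than real time.
    """
    start = time.perf_counter()
    for _ in range(n_steps):
        step()
    wall = time.perf_counter() - start
    return (n_steps * dt) / wall

# Example with a trivial stand-in step (a real engine would do physics here):
rtf = real_time_factor(lambda: None, dt=1e-2, n_steps=1000)
print(f"real-time factor: {rtf:.1f}x")
```

Swapping the lambda for the actual step call of whichever scene you built reproduces the comparison above (small particle counts near or above 1.0, very large ones well below it).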

zhouxian avatar Dec 22 '24 20:12 zhouxian

Got it! What is the data representation of the character motion in the demo? For example, is Wukong represented by particles, or are traditional 3D models generated frame by frame? Although the generative functionality is not released yet, I'm interested in how it plugs into the underlying physics and rendering engines.

yic03685 avatar Dec 22 '24 20:12 yic03685

It’s a rigged mesh, and we generate the motion graph.
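
For readers unfamiliar with the term: a motion graph is a directed graph whose nodes are short motion clips and whose edges mark transitions where clip boundaries are similar enough to blend; new motion is synthesized by walking the graph. A toy sketch of the idea follows; the clip names and transition table are invented for illustration and are not from Genesis:

```python
import random

# Hypothetical transition table: an edge means the two clips' boundary
# poses can be blended seamlessly (illustrative only, not Genesis data).
MOTION_GRAPH = {
    "idle": ["walk", "idle"],
    "walk": ["walk", "run", "idle"],
    "run":  ["run", "walk"],
}

def synthesize(start, length, rng=None):
    """Random walk over the motion graph, returning a clip sequence."""
    rng = rng or random.Random(0)
    clip = start
    sequence = [clip]
    for _ in range(length - 1):
        clip = rng.choice(MOTION_GRAPH[clip])  # follow an outgoing edge
        sequence.append(clip)
    return sequence

print(synthesize("idle", 6))
```

In a full system, each selected clip would then drive the joints of the rigged mesh, which is what the physics and rendering layers consume.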

zhouxian avatar Dec 22 '24 20:12 zhouxian

I see. So the input is the rigged mesh, the engine generates the motion of the control points, and it's close to real time depending on the use case.

yic03685 avatar Dec 22 '24 20:12 yic03685

> It's a rigged mesh, and we generate the motion graph

Hello, where is the Wukong entity added? I couldn't find it at that location.

dmhaomoon avatar Dec 23 '24 02:12 dmhaomoon

> Got it! What is the data representation of the character motion in the demo? For example, is Wukong represented by particles, or are traditional 3D models generated frame by frame? Although the generative functionality is not released yet, I'm interested in how it plugs into the underlying physics and rendering engines.

May I ask which example is the Wukong one?

dmhaomoon avatar Dec 23 '24 05:12 dmhaomoon

There’s one in the gallery showing character motion generation

yic03685 avatar Dec 23 '24 07:12 yic03685