Enhancement to #6 - right now syncing a `bpy.types.Image` to Unity happens under the following conditions:
1. Whenever a SpaceImageEditor redraws (via `draw_handler_add`) and the draw tool is active
2. ...
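For reference, a minimal sketch of what that draw-handler hook could look like, assuming a `sync_image_to_unity` placeholder for the actual bridge call and using the editor's paint mode as a stand-in for the "draw tool is active" check:

```python
import bpy

def sync_image_to_unity(image: bpy.types.Image):
    """Placeholder for the bridge call that actually ships the pixels."""
    print(f"would sync {image.name} ({image.size[0]}x{image.size[1]})")

def on_image_editor_draw():
    space = bpy.context.space_data
    if not isinstance(space, bpy.types.SpaceImageEditor):
        return
    # Only push pixels while painting - a plain redraw while inspecting UVs
    # shouldn't cost a full upload.
    if space.mode == 'PAINT' and space.image is not None:
        sync_image_to_unity(space.image)

handle = bpy.types.SpaceImageEditor.draw_handler_add(
    on_image_editor_draw, (), 'WINDOW', 'POST_PIXEL')

# On unregister:
# bpy.types.SpaceImageEditor.draw_handler_remove(handle, 'WINDOW')
```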
I would like to be able to edit a texture directly within Blender and pipeline those pixels into Unity - the same way viewport pixels are piped from Unity. A few concerns...
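As a starting point, a rough sketch of pulling the raw pixels off a `bpy.types.Image` for transport. The encoding and the actual bridge call are left out, and `foreach_get` on the pixel array is assumed to be available (Blender 2.83+):

```python
import bpy
import numpy as np

def grab_pixels(image: bpy.types.Image) -> np.ndarray:
    """Copy an image's RGBA float pixels into a (height, width, 4) array."""
    w, h = image.size
    buf = np.empty(w * h * 4, dtype=np.float32)
    # foreach_get is far faster than iterating image.pixels element by element.
    image.pixels.foreach_get(buf)
    return buf.reshape(h, w, 4)

# e.g. pixels = grab_pixels(bpy.data.images['MyTexture'])
```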
Related to #13. Right now all data synced from Blender is transient - it gets added to the `[Blender Sync]` GO group and goes away once we disconnect from Blender. To...
With my mech test scene, I notice a few performance issues that could be improved:
- During initial scene import to Unity it dumps all the geometry into...
The sample mech file I'm trying to stress test with has two main scenes - one contains the actual mech geometry and the other contains a reference to that for...
When zooming in/out with an orthographic camera, objects are culled improperly. This is because the camera position is also moving towards/away from the origin (not just `.orthographicSize` changes), which will cause...
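One possible fix sketched from the Blender side: send the orthographic half-height explicitly, derived from the viewport's projection matrix (where `P[1][1] == 2 / (top - bottom)`), so the Unity camera can drive `orthographicSize` instead of dollying its position. The dict layout below is illustrative, not the add-on's actual wire format:

```python
import bpy

def viewport_camera_state(region_3d: bpy.types.RegionView3D) -> dict:
    """Collect what the Unity side needs to mirror a viewport camera."""
    state = {
        'is_ortho': region_3d.view_perspective == 'ORTHO',
        'view_matrix': [list(row) for row in region_3d.view_matrix],
    }
    if state['is_ortho']:
        proj = region_3d.window_matrix
        # Half-height of the ortho view volume - Unity's orthographicSize convention.
        state['orthographic_size'] = 1.0 / proj[1][1]
    return state
```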
This is mostly on the Unity side. I already send Unity the list of objects visible from each viewport camera. I just don't know how to handle this within Unity....
Kind of a hard issue to tackle - I want proper change tracking on data so we only send the fragments that have been modified. This should cover:
* Only send (and...
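Blender's depsgraph already exposes most of what's needed here; a sketch of collecting per-object dirty flags after each evaluation (the queue/flush side that actually sends the fragments is assumed):

```python
import bpy
from bpy.app.handlers import persistent

dirty_geometry = set()
dirty_transforms = set()

@persistent
def on_depsgraph_update(scene, depsgraph):
    # The depsgraph reports exactly which datablocks changed and how,
    # so only those objects need to be re-sent.
    for update in depsgraph.updates:
        if not isinstance(update.id, bpy.types.Object):
            continue
        if update.is_updated_geometry:
            dirty_geometry.add(update.id.name)
        if update.is_updated_transform:
            dirty_transforms.add(update.id.name)

bpy.app.handlers.depsgraph_update_post.append(on_depsgraph_update)
```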
Ideas for enhancing performance:
* The viewport the user is actively working in within Blender should get full resolution / sync priority. Additional viewports would be updated slightly less (I'm...
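A rough sketch of what that prioritization could look like: the active viewport syncs every tick at full resolution while the others are round-robined at a reduced rate and scale. The interval and scale knobs are made up for illustration:

```python
class ViewportScheduler:
    """Decide which viewports to sync on a given tick, favoring the active one."""

    def __init__(self, background_interval=4, background_scale=0.5):
        self.background_interval = background_interval  # idle viewports sync 1-in-N ticks
        self.background_scale = background_scale        # render scale for idle viewports
        self._tick = 0

    def plan(self, viewports, active):
        """Yield (viewport, scale) pairs to sync this tick."""
        self._tick += 1
        yield active, 1.0
        others = [vp for vp in viewports if vp is not active]
        if others and self._tick % self.background_interval == 0:
            # Rotate through idle viewports so each eventually gets refreshed.
            idx = (self._tick // self.background_interval) % len(others)
            yield others[idx], self.background_scale
```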
I'd love to be able to run a simulation in Blender, see it in Unity, and then bake it immediately into an asset on the Unity side (FBX or whatever...
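A hedged sketch of the bake step: once the simulation looks right, export the selected objects as FBX straight into the Unity project's Assets folder so Unity imports it as a regular asset. The path is illustrative, and a physics sim would still need to be baked to keyframes first for `bake_anim` to capture it:

```python
import bpy

def bake_selection_to_fbx(filepath):
    """Export the current selection (geometry + baked animation) as an FBX."""
    bpy.ops.export_scene.fbx(
        filepath=filepath,              # e.g. <UnityProject>/Assets/Baked/sim.fbx
        use_selection=True,             # only the simulated objects
        bake_anim=True,                 # bake evaluated animation into keyframes
        apply_scale_options='FBX_SCALE_ALL',
    )
```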