house_diffusion
The implementation of "HouseDiffusion: Vector Floorplan Generation via a Diffusion Model with Discrete and Continuous Denoising", https://arxiv.org/abs/2211.13287
Hi @aminshabani, thanks for working on this research! I have a few questions: 1) I can't tell when the model stops training when I run: `python image_train.py --dataset rplan --batch_size...
Hi! When I run the image_sample.py script it gets stuck here. Does anyone have a clue how to solve this problem?
What might this error be? ``` (myenv) root@eb3fe363e935:/workspace/house_diffusion/scripts# python image_train.py --dataset rplan --batch_size 32 --set_name train --target_set 6 Logging to ckpts/openai_2024_04_06_20_34_10_584347 creating model and diffusion... Number of model parameters: 26541330...
Hi, when I run `python scripts/image_sample.py --dataset rplan --batch_size 32 --set_name eval --target_set 8 --model_path ckpts/exp/model250000.pt --num_samples 64` I get the following error: `FileNotFoundError: [Errno 2] No such file...
How can I solve this? `[Errno 2] No such file or directory: 'processed_rplan/rplan_train_8_cndist.npz'`
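This error usually means the cached dataset file was never generated for that `--target_set`. A hedged sketch of a pre-flight check (this is not the repo's actual code; the path pattern is inferred from the error message, and `expected_cache_path` is a hypothetical helper):

```python
import os

# Hypothetical helper: reconstruct the cache path the dataloader appears to
# expect, based on the FileNotFoundError message above.
def expected_cache_path(set_name: str, target_set: int) -> str:
    return f'processed_rplan/rplan_{set_name}_{target_set}_cndist.npz'

path = expected_cache_path('train', 8)
if not os.path.exists(path):
    # Presumably the dataset preprocessing for this --target_set never ran;
    # rerunning the training dataloader with the same flags regenerates
    # the processed_rplan/*.npz files.
    print(f'Missing cache: {path}')
```

Running the dataloader once with the same `--set_name` and `--target_set` should recreate the missing `.npz` files, assuming the preprocessing step writes them on first use.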
This error occurs when I copy image_train.py to the top level of the main directory, ./house_diffusion/image_train.py ``` $ python3.11 image_train.py --dataset rplan --batch_size 32 --set_name train --target_set 8 Traceback (most...
I want to add contour control to the generated floorplan graph, e.g. taking an input contour and bubble-diagram constraints and generating the floorplan from them. Is there a good way to do this?
I get the following error due to inhomogeneous shapes in the graphs. Did anyone else face this? How do I solve it? File "/home/roshan/Documents/TextToSchematics/house_diffusion/house_diffusion/rplanhg_datasets.py", line 243, in __init__ np.savez_compressed(f'processed_rplan/rplan_{set_name}_{target_set}', graphs=self.graphs, houses=self.houses,...
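One likely cause (an assumption, not confirmed from the repo): NumPy >= 1.24 refuses to build a regular ndarray from ragged nested lists, which `np.savez_compressed` attempts for each keyword argument. A minimal sketch of the standard workaround, wrapping ragged data in an object-dtype array before saving:

```python
import os
import tempfile
import numpy as np

# Ragged data: inner lists differ in length, so np.array(graphs) would raise
# the "inhomogeneous shape" ValueError on NumPy >= 1.24.
graphs = [[1, 2, 3], [4, 5]]

# Workaround: allocate an object-dtype array and assign elementwise, so each
# inner list is stored as a Python object instead of a rectangular array.
graphs_obj = np.empty(len(graphs), dtype=object)
graphs_obj[:] = graphs

path = os.path.join(tempfile.mkdtemp(), 'rplan_demo.npz')
np.savez_compressed(path, graphs=graphs_obj)

# Object arrays are pickled inside the .npz, so loading needs allow_pickle=True.
loaded = np.load(path, allow_pickle=True)
print(loaded['graphs'][1])  # the second ragged entry, [4, 5]
```

Applying the same wrapping to `self.graphs`, `self.houses`, etc. before the `np.savez_compressed` call in rplanhg_datasets.py should sidestep the error; pinning NumPy below 1.24 is an alternative.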
Hello, our current Python version is 3.8, torch version 2.0.0, CUDA version 11.8. The environment configuration and JSON generation work fine, but running python image_train.py --dataset...