Results: 342 comments of François Chollet

> I want to save the trainable lora-related parameters instead of the whole model. How to do this?

Assuming this is with a KerasNLP backbone, you can do:

```python
backbone.save_lora_weights(filepath)
```
...
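For context, a minimal usage sketch of this workflow, assuming a KerasNLP backbone with LoRA enabled (the preset name, rank, and filepath below are illustrative, not from the original comment):

```python
import keras_nlp

# Load a backbone and enable LoRA on its attention projections.
backbone = keras_nlp.models.GemmaBackbone.from_preset("gemma_2b_en")
backbone.enable_lora(rank=4)

# ... fine-tune the model ...

# Save only the trainable LoRA weights (the filepath must end in ".lora.h5").
backbone.save_lora_weights("finetuned.lora.h5")

# Later: rebuild the base backbone and load the LoRA weights back in.
backbone = keras_nlp.models.GemmaBackbone.from_preset("gemma_2b_en")
backbone.enable_lora(rank=4)
backbone.load_lora_weights("finetuned.lora.h5")
```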

A simpler solution to your problem would be:

1. Instantiate the new Dense layer, e.g. `dense = Dense.from_config(...)`. (It doesn't have weights at that time.)
2. Set `dense.kernel = old_layer.kernel`, ...

The setter thing turned out to be problematic. What I would recommend is still direct assignment, but using `._kernel` instead of `.kernel`. Ref: https://github.com/keras-team/keras/pull/19469
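A minimal sketch of the recommended pattern, assuming Keras 3 `Dense` layers; the `old_layer`, the input shape, and the explicit `build()` calls here are illustrative setup, not part of the original comments:

```python
from keras import layers

# Assume `old_layer` is an existing, built Dense layer whose weights we want to reuse.
old_layer = layers.Dense(64, activation="relu")
old_layer.build((None, 32))

# 1. Instantiate a new Dense layer from the old layer's config (it has no weights yet).
dense = layers.Dense.from_config(old_layer.get_config())
dense.build((None, 32))

# 2. Point the new layer at the old layer's variables. Assigning the private
#    `_kernel` attribute bypasses the `kernel` property setter mentioned above.
dense._kernel = old_layer._kernel
dense.bias = old_layer.bias
```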

Please take a look at the test failures.

Sorry for the delay in merging; this was pending due to a failing test. Re-running now.

Closing stale, inactive PR -- feel free to open a new one.

The reason has to do with parallelization. Conditional branches are much harder to handle than a single serial op, even if that op has more FLOPs. @haifeng-jin can you advise here...
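This is not the code under discussion, just a hedged illustration of the trade-off: computing both branches and selecting elementwise costs more FLOPs, but it yields a single vectorized op that parallelizes and compiles cleanly, whereas data-dependent Python branching does not.

```python
import numpy as np
from keras import ops

def leaky_relu_branchy(x):
    # Per-element Python conditionals: cheap in FLOPs, but serial, and they
    # block vectorization / graph compilation on accelerator backends.
    x = ops.convert_to_numpy(x)
    return np.array([xi if xi > 0 else 0.1 * xi for xi in x.flat]).reshape(x.shape)

def leaky_relu_branchless(x):
    # Compute both branches and select with `where`: roughly twice the FLOPs,
    # but a single op that runs well in parallel on GPU/TPU.
    return ops.where(x > 0, x, 0.1 * x)
```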

Thanks, Haifeng! @jackd can you use the same code to benchmark the impact of this change?