TensorFlow/Keras deprecates `alpha` in `LeakyReLU` in favor of `negative_slope`.
The solution is probably just a global search-and-replace of calls like this:
`LeakyReLU(alpha=0.2)`
with:
`LeakyReLU(negative_slope=0.2)`
Just a note.
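For what it's worth, the search-and-replace could be scripted as a small regex rewrite. This is just a sketch, not an official migration tool; the function name and regex are my own, and it only handles the keyword form `alpha=` (positional arguments would need a manual pass):

```python
import re

def rename_alpha(source: str) -> str:
    """Rewrite LeakyReLU(alpha=...) calls to use negative_slope instead.

    Only touches the keyword form; positional alpha arguments are left alone.
    """
    return re.sub(r"(LeakyReLU\()\s*alpha=", r"\1negative_slope=", source)

print(rename_alpha("x = LeakyReLU(alpha=0.2)(x)"))
# x = LeakyReLU(negative_slope=0.2)(x)
```

Running this over each `.py` file in the repo (and writing the result back) should cover the common case.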
Thanks @stnava. We should probably start migrating to keras 3. Do you or @cookpa see any issues with this?
Nm. I think the move might still be a bit premature.
ChatGPT Summary ---- see point 1 ---- very interesting:
Keras 3 introduces several new features and improvements, making it a significant upgrade from its previous versions. Here are some of the key highlights:

- Multi-Backend Support: Keras 3 acts as a "super-connector," allowing you to run your Keras workflows on TensorFlow, JAX, or PyTorch. This flexibility enables developers to choose the best tool for their specific tasks without changing the codebase.
- Performance Optimization: By default, Keras 3 leverages XLA (Accelerated Linear Algebra) compilation, optimizing mathematical computations for faster execution on hardware like GPUs and TPUs. This allows for more efficient model training and experimentation.
- Expanded Ecosystem: Keras 3 supports a wide range of pretrained models across different backends, including models from Keras Applications, KerasCV, and KerasNLP. This includes popular models like BERT, T5, and YOLOv8.
- Cross-Framework Data Pipelines: Keras 3 enables seamless integration with various data loading and preprocessing frameworks, such as TensorFlow's `tf.data.Dataset`, PyTorch's `DataLoader`, NumPy arrays, and Pandas dataframes. This cross-framework compatibility fosters greater flexibility in model training.
- Stateless API: Keras 3 introduces a stateless API for layers, models, metrics, and optimizers, which is particularly useful for JAX's requirement for stateless functions. This makes Keras more compatible with functional programming paradigms.
- Progressive Disclosure of Complexity: Keras 3 maintains its user-friendly design by allowing users to start with simple workflows and progressively access more advanced features as needed. This design principle supports both beginners and advanced users, providing a smooth learning curve and flexibility in model development.
- Improved Distributed Training: Keras 3 enhances distributed training capabilities with new APIs for data and model parallelism. This includes tools for sharding models across multiple devices, making it easier to train large models on distributed hardware setups.

These features collectively make Keras 3 a more versatile, efficient, and user-friendly deep learning framework. For detailed guides on migrating from Keras 2 to Keras 3 and utilizing the new features, you can refer to the official Keras documentation.
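As a concrete illustration of the multi-backend point, Keras 3 selects its backend from the `KERAS_BACKEND` environment variable, which has to be set before `keras` is first imported. A minimal sketch (the `import keras` line is commented out so the snippet stands alone; any of the listed backends would need to be installed for the real import):

```python
import os

# Keras 3 reads KERAS_BACKEND once, at first import of the keras module,
# so the variable must be set before that import happens.
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow" or "torch"

# import keras                    # real code would import here
# keras.backend.backend()         # would then report the active backend
```

The same model code should then run unchanged on whichever backend is selected.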
Thanks @stnava. Yeah, I'm sure it's great. I tried training with it last night but didn't get very far. But we should definitely start thinking about a migration strategy.
I wonder if this would be more viable now.
I thought we already migrated to keras 3 with some work we did with @cookpa . On my machines, I'm showing keras 3.8.0. Is that not the case?
I don't know, but we still get the warning from the original issue, i.e., `alpha` should be `negative_slope`.
I see---wrong issue. Yeah, I can fix that.
Okay, I think we can close this but re-open if needed.