cerebros-core-algorithm-alpha
The Cerebros package is an ultra-precise Neural Architecture Search (NAS) / AutoML system that is intended to mimic biological neurons much more closely than conventional neural network architecture strategies...
Challenges: 1. Making sure that if this is run in the same environment, it will not conflict with the existing DB, e.g. overwriting instead of appending. 2. Race condition #74
Kind of issue: enhancement / integration. Question: do we make a direct integration in the Cerebros API, or do we build the integration within the Cerebros Enterprise / Kale-Kubeflow templates?...
Kind of issue: minor security vulnerability (CVE-2024-5206). In cicd-requirements.txt, scikit-learn needs to be upgraded to >= 1.5.0.
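Assuming cicd-requirements.txt uses standard pip requirement syntax, the corresponding pin would look like:

```
scikit-learn>=1.5.0
```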
Kind of issue: Feature development Issue described: We have a successful implementation of a Ternary replacement for Dense layers. The metrics are not quite what we want on some problems....
Kind of issue: enhancement. Issue described: Try using a ternary operation layer instead of a Dense layer, e.g. replace each occurrence of `tf.keras.layers.Dense` with a custom layer like this: ```python...
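The elided code above is not recoverable, but the core of a ternary replacement for a Dense layer is quantizing each weight to one of three values {-alpha, 0, +alpha}. A minimal NumPy sketch of that quantization step, using the Ternary Weight Networks heuristic (threshold at 0.7 times the mean absolute weight) as an illustrative assumption rather than the repository's actual implementation:

```python
import numpy as np

def ternarize(w, delta_scale=0.7):
    """Quantize a float weight matrix to values in {-alpha, 0, +alpha}.

    delta_scale=0.7 follows the Ternary Weight Networks heuristic;
    treat it as a tunable assumption, not the Cerebros default.
    """
    delta = delta_scale * np.mean(np.abs(w))  # sparsity threshold
    mask = np.abs(w) > delta                  # weights that survive pruning
    signs = np.sign(w) * mask                 # -1 / 0 / +1 pattern
    # alpha = mean magnitude of the surviving weights
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    return alpha * signs

w = np.array([[0.9, -0.05, -1.2],
              [0.1,  0.7,  -0.6]])
tw = ternarize(w)
# Every entry of tw is one of {-alpha, 0, +alpha}
```

In a Keras custom layer, this quantization would be applied to the kernel in `call()` (typically with a straight-through estimator so gradients still flow to the underlying float weights).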
```python
{
    'embedding_n': 12,
    'activation': 'gelu',
    'predecessor_level_connection_affinity_factor_first': 38.7,
    'predecessor_level_connection_affinity_factor_main': 4.6,
    'max_consecutive_lateral_connections': 32,
    'p_lateral_connection': 17.1,
    'num_lateral_connection_tries_per_unit': 1,
    'learning_rate': 0.046500145665525995,
    'epochs': 7,
    'batch_size': 22,
    'dropout': 0.6500000000000001,
    'maximum_units_per_level': 7,
    'maximum_neurons_per_unit': 7,
    'temperature': 36912  # ...
}
```
## TLDR

- Deletion of the duplicative `import tensorflow as tf` after we train the baseline GPT model, before we train the Cerebros model, appears to lower val_binary_accuracy on the...
It appears phishing_email_detection_gpt2.py line 407 needs the zero mask set to False. This is an AI-introduced error.

```python
embedded = tf.keras.layers.Embedding(
    input_dim=VOCABULARY_SIZE,
    output_dim=EMBEDDING_DIM,
    input_length=max_seq_length,
    mask_zero=True)(tokens)  # should be mask_zero=False
```