
Fix issue 416 for GANs and info

Open ooples opened this issue 2 months ago • 1 comment

User Story / Context

  • Reference: [US-XXX] (if applicable)
  • Base branch: merge-dev2-to-master

Summary

  • What changed and why (scoped strictly to the user story / PR intent)

Verification

  • [ ] Builds succeed (scoped to changed projects)
  • [ ] Unit tests pass locally
  • [ ] Code coverage >= 90% for touched code
  • [ ] Codecov upload succeeded (if token configured)
  • [ ] TFM verification (net46, net6.0, net8.0) passes (if packaging)
  • [ ] No unresolved Copilot comments on HEAD

Copilot Review Loop (Outcome-Based)

Record counts before/after your last push:

  • Comments on HEAD BEFORE: [N]
  • Comments on HEAD AFTER (60s): [M]
  • Final HEAD SHA: [sha]

Files Modified

  • [ ] List files changed (must align with scope)

Notes

  • Any follow-ups, caveats, or migration details

ooples · Nov 08 '25 01:11

[!WARNING]

Rate limit exceeded

@ooples has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 1 minute and 32 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between 50d865177330a5f665ee942e7673d410b50e7b74 and a591082d55c920bd4f270e3df88fe40869814cec.

📒 Files selected for processing (5)
  • src/LossFunctions/WassersteinLoss.cs (1 hunks)
  • src/Metrics/InceptionScore.cs (1 hunks)
  • src/NeuralNetworks/ACGAN.cs (1 hunks)
  • src/NeuralNetworks/ConditionalGAN.cs (1 hunks)
  • src/NeuralNetworks/WGAN.cs (1 hunks)

[!NOTE]

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Adds comprehensive GAN support: new documentation, 13 GAN implementations, supporting layers/metrics (SpectralNorm layer, FID, IS), ModelType enum entries, GAN base refactor to accept injected optimizers, and unit tests including a MockLoss helper. No public APIs removed.
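
The injected-optimizer refactor called out above is essentially constructor injection: the GAN base class no longer keeps its own Adam state and simply asks whichever optimizer it was given to apply gradients to each sub-network. Below is a minimal, self-contained C# sketch of that pattern; the interface and class names (IGradientBasedOptimizerSketch, SgdOptimizerSketch, GanSketch) are illustrative stand-ins, not the actual AiDotNet types.

```csharp
using System;

// Illustrative stand-in for a gradient-based optimizer abstraction.
// All names here are hypothetical and do not mirror the real AiDotNet interfaces.
public interface IGradientBasedOptimizerSketch
{
    // Returns updated parameters given the current parameters and their gradients.
    double[] ApplyGradients(double[] parameters, double[] gradients);
}

public sealed class SgdOptimizerSketch : IGradientBasedOptimizerSketch
{
    private readonly double _learningRate;

    public SgdOptimizerSketch(double learningRate) => _learningRate = learningRate;

    public double[] ApplyGradients(double[] parameters, double[] gradients)
    {
        var updated = new double[parameters.Length];
        for (int i = 0; i < parameters.Length; i++)
            updated[i] = parameters[i] - _learningRate * gradients[i];
        return updated;
    }
}

// The GAN keeps no optimizer state of its own; callers inject one optimizer
// per sub-network (or accept defaults when the arguments are omitted).
public sealed class GanSketch
{
    private readonly IGradientBasedOptimizerSketch _generatorOptimizer;
    private readonly IGradientBasedOptimizerSketch _discriminatorOptimizer;

    public GanSketch(
        IGradientBasedOptimizerSketch? generatorOptimizer = null,
        IGradientBasedOptimizerSketch? discriminatorOptimizer = null)
    {
        _generatorOptimizer = generatorOptimizer ?? new SgdOptimizerSketch(0.0002);
        _discriminatorOptimizer = discriminatorOptimizer ?? new SgdOptimizerSketch(0.0002);
    }

    public double[] UpdateGenerator(double[] parameters, double[] gradients)
        => _generatorOptimizer.ApplyGradients(parameters, gradients);

    public double[] UpdateDiscriminator(double[] parameters, double[] gradients)
        => _discriminatorOptimizer.ApplyGradients(parameters, gradients);
}
```

The practical benefit is that the generator and discriminator can be tuned independently (different optimizers, learning rates, or schedules) without touching the GAN class itself, and tests can inject deterministic optimizers.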

Changes

  • Documentation: docs/GANs_Implementation_Guide.md
    New comprehensive AiDotNet-centric GAN guide covering architectures, layers, metrics, examples, training tips, references, and contribution/license notes.
  • Enums: src/Enums/ModelType.cs
    Added enum members for the new GAN model types (DCGAN, WassersteinGAN, WassersteinGANGP, ConditionalGAN, AuxiliaryClassifierGAN, InfoGAN, StyleGAN, ProgressiveGAN, BigGAN, CycleGAN, Pix2Pix, SAGAN, etc.).
  • Metrics: src/Metrics/FrechetInceptionDistance.cs, src/Metrics/InceptionScore.cs
    New FID and Inception Score implementations: feature-extraction integration, statistics computation, split handling, and a combined metrics API.
  • Layer: src/NeuralNetworks/Layers/SpectralNormalizationLayer.cs
    New SpectralNormalizationLayer that wraps an inner layer and applies power-iteration spectral normalization during forward/backward; it delegates parameter ops and disallows JIT export (see the power-iteration sketch after this table).
  • GAN base refactor: src/NeuralNetworks/GenerativeAdversarialNetwork.cs
    The constructor now takes an InputType and optional generator/discriminator IGradientBasedOptimizer instances; the inline Adam state is removed in favor of injected optimizers; a CreateGANArchitecture helper and stricter noise validation were added.
  • Classic / Wasserstein GANs: src/NeuralNetworks/DCGAN.cs, src/NeuralNetworks/WGAN.cs, src/NeuralNetworks/WGANGP.cs
    Added DCGAN (builders), WGAN (critic + weight clipping), and WGANGP (gradient penalty, multi-critic updates) with generation, training, evaluation, and serialization hooks (see the gradient-penalty sketch after this table).
  • Conditional / Auxiliary / InfoGAN: src/NeuralNetworks/ConditionalGAN.cs, src/NeuralNetworks/ACGAN.cs, src/NeuralNetworks/InfoGAN.cs
    Added ConditionalGAN, ACGAN, and InfoGAN (with Q-network and mutual information), including TrainStep APIs, conditional generation utilities, optimizer integration, and persistence.
  • Image-to-Image GANs: src/NeuralNetworks/Pix2Pix.cs, src/NeuralNetworks/CycleGAN.cs
    Added Pix2Pix (U-Net + PatchGAN) and CycleGAN (dual generators/discriminators, cycle/identity losses), with Translate/TrainStep flows and serialization.
  • Advanced / Large-scale GANs: src/NeuralNetworks/StyleGAN.cs, src/NeuralNetworks/ProgressiveGAN.cs, src/NeuralNetworks/BigGAN.cs, src/NeuralNetworks/SAGAN.cs
    Added StyleGAN, ProgressiveGAN, BigGAN, and SAGAN: mapping/synthesis networks, progressive growth, embeddings/truncation, self-attention/spectral norm, and training/generation/metadata APIs.
  • Tests & Mocks: tests/.../GenerativeAdversarialNetworkTests.cs, tests/.../Helpers/MockLossFunction.cs
    New unit tests covering the GAN variants plus a MockLossFunction<T> test helper for deterministic loss/derivative behavior and call tracking.
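
For readers unfamiliar with the technique behind SpectralNormalizationLayer: spectral normalization estimates the largest singular value of a weight matrix with a few power-iteration steps and divides the weights by that estimate, keeping the wrapped layer roughly 1-Lipschitz. The following is a minimal sketch on a plain 2-D array, assuming nothing about AiDotNet's tensor types; the real layer would persist the iteration vectors between forward passes instead of reinitializing them.

```csharp
using System;

public static class SpectralNormSketch
{
    // Estimates the largest singular value of w via power iteration and
    // returns w divided by that estimate, so the result has spectral norm ~1.
    public static double[,] Normalize(double[,] w, int iterations = 3)
    {
        int rows = w.GetLength(0), cols = w.GetLength(1);
        var rng = new Random(42);

        // Right singular vector estimate (persisted across calls in a real layer).
        var v = new double[cols];
        for (int j = 0; j < cols; j++) v[j] = rng.NextDouble() - 0.5;
        v = L2Normalize(v);

        var u = new double[rows];
        for (int it = 0; it < iterations; it++)
        {
            // u <- normalize(W v)
            for (int i = 0; i < rows; i++)
            {
                double s = 0;
                for (int j = 0; j < cols; j++) s += w[i, j] * v[j];
                u[i] = s;
            }
            u = L2Normalize(u);

            // v <- normalize(W^T u)
            for (int j = 0; j < cols; j++)
            {
                double s = 0;
                for (int i = 0; i < rows; i++) s += w[i, j] * u[i];
                v[j] = s;
            }
            v = L2Normalize(v);
        }

        // sigma ~= u^T W v is the estimated largest singular value.
        double sigma = 0;
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                sigma += u[i] * w[i, j] * v[j];

        var normalized = new double[rows, cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                normalized[i, j] = w[i, j] / sigma;
        return normalized;
    }

    private static double[] L2Normalize(double[] x)
    {
        double norm = 0;
        foreach (double xi in x) norm += xi * xi;
        norm = Math.Sqrt(norm) + 1e-12;
        var y = new double[x.Length];
        for (int i = 0; i < x.Length; i++) y[i] = x[i] / norm;
        return y;
    }
}
```

Because the u/v estimates are normally carried over from the previous step, a single iteration per training step is usually enough; the reshape/restore and side-effect behavior of that persisted state is what the review notes further down single out.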

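Likewise, the WGAN-GP gradient penalty noted for WGANGP.cs penalizes the critic whenever the gradient of its output, taken with respect to points interpolated between real and fake samples, has a norm different from 1: penalty = lambda * (||grad D(x_hat)|| - 1)^2. Below is a toy sketch with a critic whose input gradient has a closed form, so the computation stays dependency-free; a real implementation backpropagates through the full critic and averages the penalty over a batch.

```csharp
using System;
using System.Linq;

public static class GradientPenaltySketch
{
    // Toy critic D(x) = sum_i tanh(w_i * x_i); its input gradient is
    // dD/dx_i = w_i * (1 - tanh(w_i * x_i)^2), so no autodiff is needed here.
    private static double[] CriticInputGradient(double[] x, double[] w)
    {
        var grad = new double[x.Length];
        for (int i = 0; i < x.Length; i++)
        {
            double t = Math.Tanh(w[i] * x[i]);
            grad[i] = w[i] * (1 - t * t);
        }
        return grad;
    }

    // WGAN-GP penalty: lambda * (||grad_xhat D(xhat)||_2 - 1)^2,
    // where xhat = eps * real + (1 - eps) * fake for a random eps in [0, 1).
    public static double GradientPenalty(
        double[] real, double[] fake, double[] criticWeights,
        double lambda = 10.0, int seed = 0)
    {
        var rng = new Random(seed);
        double eps = rng.NextDouble();

        var interpolated = new double[real.Length];
        for (int i = 0; i < real.Length; i++)
            interpolated[i] = eps * real[i] + (1 - eps) * fake[i];

        var grad = CriticInputGradient(interpolated, criticWeights);
        double norm = Math.Sqrt(grad.Sum(g => g * g));
        return lambda * Math.Pow(norm - 1.0, 2);
    }
}
```
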
Sequence Diagram(s)

sequenceDiagram
    participant Trainer
    participant Generator
    participant Discriminator
    participant Optimizer_G as Optimizer(Generator)
    participant Optimizer_D as Optimizer(Discriminator)
    participant Metric

    rect rgb(245,250,255)
    Note over Trainer,Metric: High‑level GAN training iteration (supports conditional/aux/paired/dual networks)
    end

    Trainer->>Generator: sample noise (+ optional conditioning/class embeddings)
    Generator-->>Trainer: generated images
    Trainer->>Discriminator: forward(real images / real conditions)
    Discriminator-->>Trainer: real logits (+ aux outputs)
    Trainer->>Discriminator: forward(generated images / fake conditions)
    Discriminator-->>Trainer: fake logits (+ aux outputs)
    Trainer->>Trainer: compute losses (adv + aux/MI/cycle/L1/style)
    Trainer->>Optimizer_D: apply discriminator gradients
    Optimizer_D-->>Discriminator: update params
    Trainer->>Optimizer_G: apply generator gradients
    Optimizer_G-->>Generator: update params
    Trainer->>Metric: optionally compute IS/FID (feature extraction via network)
    Metric-->>Trainer: return scores
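
To make the ordering in the diagram concrete, here is a deliberately tiny, self-contained training loop for a 1-D GAN (linear generator, logistic-regression discriminator, hand-derived gradients). It only illustrates the discriminator-then-generator update flow from the diagram, not AiDotNet's actual API: real data is drawn from N(3, 1), and the generator's offset should drift toward that mean.

```csharp
using System;

public static class ToyGanStepSketch
{
    public static void Main()
    {
        var rng = new Random(0);
        // Generator: x = a*z + b.  Discriminator: D(x) = sigmoid(w*x + c).
        double a = 0.1, b = 0.0, w = 0.1, c = 0.0;
        const double lr = 0.05;

        for (int step = 0; step < 2000; step++)
        {
            // Trainer -> Generator: sample noise, produce a fake sample.
            double z = Gaussian(rng);
            double fake = a * z + b;

            // Real data sample from N(3, 1).
            double real = 3.0 + Gaussian(rng);

            // Discriminator forward on the real and the fake sample.
            double dReal = Sigmoid(w * real + c);
            double dFake = Sigmoid(w * fake + c);

            // Discriminator update first (labels: real = 1, fake = 0).
            double gradW = (dReal - 1.0) * real + dFake * fake;
            double gradC = (dReal - 1.0) + dFake;
            w -= lr * gradW;
            c -= lr * gradC;

            // Generator update second, against the freshly updated discriminator,
            // using the non-saturating loss -log D(fake).
            double dFakeNew = Sigmoid(w * fake + c);
            double gradFake = (dFakeNew - 1.0) * w;   // dL_G / d(fake)
            a -= lr * gradFake * z;
            b -= lr * gradFake;
        }

        Console.WriteLine($"Generator offset b = {b:F2} (real mean is 3.0), scale a = {a:F2}");
    }

    private static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    // Box-Muller sample from a standard normal distribution.
    private static double Gaussian(Random rng)
    {
        double u1 = 1.0 - rng.NextDouble();
        double u2 = rng.NextDouble();
        return Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Cos(2.0 * Math.PI * u2);
    }
}
```

The scale parameter a is only weakly constrained by such a simple critic, a small-scale hint of why the real implementations pair richer networks with metrics such as IS/FID (the last step in the diagram) to judge sample quality.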

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120+ minutes

  • Areas needing careful review:
    • Multi‑network training & combined gradient flows: ACGAN.cs, InfoGAN.cs, CycleGAN.cs, StyleGAN.cs, BigGAN.cs, ProgressiveGAN.cs, SAGAN.cs
    • Optimizer injection and consistent UpdateParameters/serialization across composite models
    • Numerical routines: the matrix square root / Newton-Schulz iteration in FrechetInceptionDistance.cs (see the sketch after this list) and KL-divergence stability in InceptionScore.cs
    • SpectralNormalization weight reshape/restore and side‑effects in SpectralNormalizationLayer.cs
    • Parameter vector assembly/splitting order in multi‑subnetwork models and corresponding tests
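
For context on the matrix-square-root item above: FID compares Gaussian fits of real and generated features and needs the square root of a covariance product, and the Newton-Schulz iteration approximates that square root using only matrix multiplications. A self-contained sketch for a symmetric positive-definite input, on plain 2-D arrays rather than AiDotNet's matrix types:

```csharp
using System;

public static class NewtonSchulzSketch
{
    // Approximates the principal square root of a symmetric positive-definite
    // matrix A via the coupled Newton-Schulz iteration:
    //   Y_0 = A / ||A||_F,  Z_0 = I
    //   T = 0.5 * (3I - Z_k * Y_k);  Y_{k+1} = Y_k * T;  Z_{k+1} = T * Z_k
    //   sqrt(A) ~= Y_k * sqrt(||A||_F)
    public static double[,] Sqrtm(double[,] a, int iterations = 20)
    {
        int n = a.GetLength(0);
        double normA = FrobeniusNorm(a);

        var y = Scale(a, 1.0 / normA);
        var z = Identity(n);

        for (int k = 0; k < iterations; k++)
        {
            var t = Scale(Subtract(Scale(Identity(n), 3.0), Multiply(z, y)), 0.5);
            y = Multiply(y, t);
            z = Multiply(t, z);
        }
        return Scale(y, Math.Sqrt(normA));
    }

    private static double FrobeniusNorm(double[,] m)
    {
        double s = 0;
        foreach (double v in m) s += v * v;
        return Math.Sqrt(s);
    }

    private static double[,] Identity(int n)
    {
        var m = new double[n, n];
        for (int i = 0; i < n; i++) m[i, i] = 1.0;
        return m;
    }

    private static double[,] Scale(double[,] m, double s)
    {
        int rows = m.GetLength(0), cols = m.GetLength(1);
        var r = new double[rows, cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                r[i, j] = m[i, j] * s;
        return r;
    }

    private static double[,] Subtract(double[,] a, double[,] b)
    {
        int rows = a.GetLength(0), cols = a.GetLength(1);
        var r = new double[rows, cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                r[i, j] = a[i, j] - b[i, j];
        return r;
    }

    private static double[,] Multiply(double[,] a, double[,] b)
    {
        int n = a.GetLength(0), m = a.GetLength(1), p = b.GetLength(1);
        var r = new double[n, p];
        for (int i = 0; i < n; i++)
            for (int k = 0; k < m; k++)
                for (int j = 0; j < p; j++)
                    r[i, j] += a[i, k] * b[k, j];
        return r;
    }
}
```

In FID itself the argument is the product of two covariance matrices, which is not symmetric; a common workaround is to compute the symmetrized form sqrt(C2^(1/2) * C1 * C2^(1/2)), which has the same trace, and that is exactly the kind of numerical subtlety the review note above flags.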

Possibly related PRs

  • ooples/AiDotNet#222 — optimizer lifecycle changes; likely strongly related to the injected-optimizer refactor in this PR.
  • ooples/AiDotNet#422 — overlapping GAN subsystem changes; likely touches similar GAN classes and architecture wiring.

Poem

🐇
I hopped through tensors, kernels, and bright noise,
wrapped weights in norms and tuned the latent joys.
Thirteen gardens bloom — generators hum and spin,
I nibble bugs and hug the code; new art begins! 🥕

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings, 1 inconclusive)
  • Description check (⚠️ Warning): The PR description is entirely a template with placeholder text and does not contain any meaningful information about what was changed, why it was changed, or how it relates to issue 416.
    Resolution: Provide a concrete description detailing what changes were made, the rationale for implementing these GAN architectures, how they address issue 416, and a summary of the key components added.
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 74.40%, which is below the required threshold of 80.00%.
    Resolution: Run @coderabbitai generate docstrings to improve docstring coverage.
  • Title check (❓ Inconclusive): The title 'Fix issue 416 for GANs and info' is vague and does not clearly describe the main changes, which add 13 GAN architectures, metrics, and supporting components.
    Resolution: Replace it with a more specific title that clearly describes the primary change, such as 'Add GAN architectures (DCGAN, WGAN, WGAN-GP, cGAN, AC-GAN, InfoGAN, Pix2Pix, CycleGAN, StyleGAN, ProgressiveGAN, BigGAN, SAGAN) with metrics and supporting layers' or similar.



Comment @coderabbitai help to get the list of available commands and usage tips.

coderabbitai[bot] · Nov 08 '25 01:11

Quality Gate failed

Failed conditions
  • 17.1% Coverage on New Code (required ≥ 80%)
  • 9.1% Duplication on New Code (required ≤ 3%)
  • C Reliability Rating on New Code (required ≥ A)

See analysis details on SonarQube Cloud


sonarqubecloud[bot] · Dec 14 '25 17:12