Generating report looks broken (visually)
See screenshot:
Reproducible example:
Expected Behaviour
- There should be no red rectangles indicating error(s)
- The link should ideally go away, as it mistakenly suggests we are using some old version and should upgrade (separately from that, the link is also broken)
Data Description
Attached zip with MRE - minimalistic fully reproducible example.
Code that reproduces the bug
The minimalistic code & data are in the attached *.zip example.
from ydata_profiling import ProfileReport
# df holds the dataset loaded from the data in the attached zip
pp = ProfileReport(df)
pp.to_file("Report.html")
pandas-profiling version
ydata-profiling=4.15.1
Dependencies
**Libs:**
ipywidgets=8.1.5
jupyterlab=4.3.5
ydata-profiling=4.15.1
Full conda environment: conda env export (zipped):
https://github.com/user-attachments/files/19428004/conda_env.yaml.zip
OS
macOS Sequoia 15.3.2
Checklist
- [x] There is not yet another bug report for this issue in the issue tracker
- [x] The problem is reproducible from this bug report. This guide can help to craft a minimal bug report.
- [x] The issue has not been resolved by the entries listed under Common Issues.
Hi @stefansimik,
thank you for opening a question/discussion in the ydata-profiling project. Answering your questions:
- If there is an error while generating the profile, there are two possibilities: either the report is not generated at all and an exception is thrown, or a warning is raised and the profiling still completes. It might be that one of the selected correlations is not valid for your dataset; in that case the error is expected, but it would still surface as a warning (which is not what you are reporting).
- Thank you for identifying the small issue with the broken link. The banner will be kept, as it can be used by users who want to access features that are not available in the OSS version (overall data quality scores, outlier detection, missingness analysis, synthetic data generation, anonymization, among others).
Thank you @fabclmnt , regarding:
If there is an error while generating the profile, there are two possibilities: either the report is not generated at all and an exception is thrown, or a warning is raised and the profiling still completes. It might be that one of the selected correlations is not valid for your dataset; in that case the error is expected, but it would still surface as a warning (which is not what you are reporting).
I am not reporting an error, but the red rectangle (which looks like an error, even though everything is OK). That means some part of how the library integrates with the JupyterLab environment is broken, since valid processing still produces red rectangles.
Rendering red rectangles in JupyterLab is clearly incorrect and misleading behavior (especially when everything is OK), and that is what should be fixed 😊 👍
I understood, nevertheless what I was trying to convey is that the red color can sometimes appear due to how TQDM integrates with JupyterLab’s output rendering — for example, if there are warnings or non-critical exceptions handled in the background, or simply due to how the widget styles are interpreted. It's more of a visual quirk than a sign of failure.
In this case it seems to be something specific to your dataset. That said, it is hard to reproduce on our end and potentially not something that needs to be fixed in the package itself (as I mentioned, the chosen correlation matrix might not be valid for your dataset).
Hope it is clearer now.
Also, in case these red rectangles are caused by underlying warnings or non-critical exceptions in the background, it is still the library's responsibility to handle (or suppress) them gracefully, if the library considers everything to be in a valid state. Otherwise, red rectangles like these are the result.
If the correlation matrix is indeed invalid, then a better approach would be to simply hide it, or to display a clear, user-friendly message explaining why it can't be shown. That would be a valid solution, rather than propagating raw exceptions to the user.
That said, if you believe this current behavior is acceptable and user-friendly, I’m fine with it too.
Just to clarify and provide more context on my previous messages: the warnings are already being handled internally by the package. They are non-breaking and are processed gracefully as part of the profiling workflow, so they do not interfere with report generation. We can't suppress them entirely, though, because the user would then not be informed why a certain metric was not computed, which would be a bad user experience.
Regarding the red TQDM progress bar: this isn't tied to an error or an issue with the profiling itself. It's related to how TQDM interprets output streams. A few reasons why it might appear red:
- Progress logged via stderr: ydata-profiling (and some of its dependencies) log progress or messages via stderr, particularly during multiprocessing. TQDM detects this and renders the progress bar in red; this is purely cosmetic and not indicative of an error.
- Warnings during profiling: warnings like DtypeWarning or InvalidMatrixWarning are captured and managed within the package. They're surfaced to inform users when specific metrics couldn't be computed, but they don't disrupt execution. However, since they're routed through stderr, they can still trigger the red bar in TQDM, which again is just visual behavior, not a sign of failure.
- Multiprocessing output: since the package uses multiprocessing by default, output from worker processes often gets funneled through stderr. This can also contribute to the red TQDM bar.
So while the red progress bar can look alarming, it's not a sign of any problem, just a side effect of how TQDM displays output under the hood. If you've noticed a specific issue with a metric or visualization, feel free to share the details; we'd be more than happy to investigate further.
Thank you very much for all the explanations ❤ and feel free to close this issue 😊 👍