
In the absence of any virtual functions, not having a virtual destructor could lead to memory leaks

Open Gajoo opened this issue 10 months ago • 15 comments

Channel

C++ Weekly

Topics

  • Virtual destructor
  • Inheritance pitfall
  • Security

Length

Short.

This came up today as I was talking with a colleague. The following code results in a memory leak, and no compiler (msvc, cl, clang) warns about it, even with all warnings enabled. This is super bad because, from what I remember, I was always told that if you have at least one virtual function, you need to have a virtual destructor.

#include <iostream>

struct tester
{
    tester()
    {
        mem = new char[50];
        std::cout << "memory allocated\n";
    }
    ~tester()
    {
        delete[] mem;
        std::cout << "memory freed\n";
    }
    char* mem;
};

struct A{};

struct B: public A{
    tester t;
};

int main()
{
    A* a = new B;   // B contains a tester, which owns heap-allocated memory
    delete a;       // undefined behavior: A has no virtual destructor, so ~B (and thus ~tester) never runs
}

Gajoo avatar Jan 23 '25 05:01 Gajoo

The sanitizer catches the issue at runtime: https://godbolt.org/z/c6v6fxesv but it is strange indeed that you are not warned about the missing virtual destructor when declaring type B. That seems easy to detect... Also, there might be side effects on which implicit operations are inhibited when declaring a destructor. From memory, I think it disables the implicit move operations.
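If memory serves, a minimal sketch of that side effect (the type names here are made up):

#include <memory>
#include <utility>

struct RuleOfZero { std::unique_ptr<int> p; };                // no user-declared destructor: implicit move works
struct HasDtor    { std::unique_ptr<int> p; ~HasDtor() {} };  // user-declared destructor: implicit move is not declared

int main() {
    RuleOfZero a;
    RuleOfZero b = std::move(a);   // OK: implicit move constructor
    HasDtor c;
    // HasDtor d = std::move(c);   // error: only the (deleted) copy constructor is considered
    (void)b; (void)c;
}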

Dharmesh946 avatar Jan 24 '25 13:01 Dharmesh946

Technically yes, but it is only catching the memory allocation problem; it doesn't seem to care that the correct destructor is not being called. It could very well be device de-initialization code that is not getting executed because of this issue, and that most certainly won't be picked up by any sanitizer.

Gajoo avatar Jan 27 '25 20:01 Gajoo

NB: it is not possible to warn at the base class level, because without any virtual function the compiler has no way to know the class will be used as a base class and is thus missing a virtual destructor. It could warn on a derived class, though (at least that the base class is not polymorphic). Yet it might issue lots of useless warnings, for instance for base classes that only store trivial data. Perhaps if we pull on the thread there are more subtle cases where having such a warning would be an issue (with -Werror), or even not feasible?

My own approach is to make my classes explicitly inheritable or not inheritable (among other properties), but it's more work on the coder's side...
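For illustration, roughly what I mean by making the intent explicit (just a sketch, the names are made up):

struct Value final {                       // `final`: cannot be used as a base class, so the question never arises
    int x = 0;
};

class InheritableNotDeletable {            // meant to be derived from, but never deleted through the base
protected:
    ~InheritableNotDeletable() = default;  // protected + non-virtual: delete through a base pointer won't compile
public:
    int y = 0;
};

struct PolymorphicBase {                   // meant to be owned and destroyed through a base pointer
    virtual ~PolymorphicBase() = default;
};

struct UsesMixin : InheritableNotDeletable {};

int main() {
    UsesMixin u;                           // fine: the derived destructor can reach the protected base destructor
    (void)u;
}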

Dharmesh946 avatar Jan 28 '25 08:01 Dharmesh946

@lefticus Just wanted to bump this, as it seems the issue has been buried after such a long time. It would be nice if compilers generated a warning for the derived class at the very least.

ali1s232 avatar Sep 17 '25 06:09 ali1s232

Oh I do what I can with these! I'll mark it and see when I might come back to it.

Problem is, I'll probably come out with an episode that disagrees with you. It's a HUGE performance and binary-size hit to unnecessarily add virtual destructors. Your type is no longer trivial when it could be.

So there's no way I would ever suggest adding virtual destructors to all classes just in case someone decides to inherit from them later.

lefticus avatar Oct 01 '25 19:10 lefticus

There seems to be a disconnect here. I do understand all the issues that would be caused by forcing virtual destructors. All I'm asking is for the compiler to recognize, when parsing the derived class definition, that the generated default destructor will result in non-trivial destruction, and to issue a warning there.

Gajoo avatar Oct 01 '25 20:10 Gajoo

It would warn everywhere, though; this situation comes up all the time in perfectly normal, safe C++.

LB-- avatar Oct 05 '25 02:10 LB--

@LB-- I only know a handful of code bases (like Unreal Engine) that have non-trivially destructible classes without a virtual destructor from which another class is inherited. If the deletion actually happens through the base class (as in the example here), we are talking about undefined behavior.

Even without deletion, in the presence of any virtual function in the base class, most compilers warn about the potential problem when parsing the base class.

What I'm suggesting, though, is different: I want the compiler to warn during the declaration of the derived class, or when a pointer to the derived class is assigned to a variable of base-class pointer type.

Gajoo avatar Oct 05 '25 02:10 Gajoo

It is common practice to inherit from non-polymorphic classes, e.g. for CRTP, disabling copy/move, intrusive linked lists, explicit object parameter helpers, and more. In the majority of those cases, the deriving class won't be trivially destructible. It would be a warning everyone turns off because of too many false positives. Slicing and pointer/reference conversions are also safe: these objects are not used in a manner where they could accidentally be destructed via a base class. What could get a warning is attempts to transfer ownership which slice, such as by std::move, but not all such cases can be detected reliably, and either way that is also still safe. The only time it's dangerous is when the objects are dynamically allocated in something like a unique_ptr that blindly allows conversions to base class, but even that is unlikely to pose a risk in practice, because the base class types are not what you would be using for smart pointer instantiations.
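For context, a minimal CRTP sketch of the kind of code such a warning would flag (the names here are mine); the non-virtual base is perfectly safe because nothing is ever deleted through it:

#include <iostream>
#include <string>

template <typename Derived>
struct Printable {
    void print() const {
        static_cast<const Derived&>(*this).print_impl();   // static dispatch, no virtual functions
    }
};

struct Message : Printable<Message> {   // not trivially destructible, inherits from a non-polymorphic base
    std::string text;
    void print_impl() const { std::cout << text << '\n'; }
};

int main() {
    Message m{{}, "hello"};
    m.print();                          // no virtual destructor needed anywhere here
}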

LB-- avatar Oct 06 '25 01:10 LB--

  • Maybe it's just game development that's my line of work, but CRTP is not common.
  • For disabling copy/move you never inherit; otherwise it's a code smell. You define them as = delete (or declare them private/protected).
  • There is no reason whatsoever that a std::move, or any other function involving a move, would result in a warning. I'm suggesting a warning be generated while parsing the derived class definition.
  • If you are sure the base class is just an interface and no one can delete through the base class, you can always declare the destructor as protected and defaulted.

The main issue here is people new to C++. There is a reason "virtual destructor" is a topic in many interviews, and guess what? The most common answer is "if there is any virtual function, define a virtual destructor". I've seen only a few senior developers who would raise the problem of not having a virtual destructor when the derived class has a user-defined destructor, but I've yet to see anyone who actually sees the code above and immediately goes: hey, this object here has a non-trivial default destructor.

Gajoo avatar Oct 06 '25 02:10 Gajoo

Maybe it's just game development that's my line of work, but CRTP is not common.

I also work in game development; I don't see how that relates to coding practices. CRTP is well known and common in the C++ world.

For disabling copy/move you never inherit; otherwise it's a code smell. You define them as = delete (or declare them private/protected).

Many codebases pre-date the ability to = delete; the pattern of using inheritance for this was so well known that it was in the Boost library (boost::noncopyable).
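For reference, a sketch in the spirit of that pre-C++11 pattern (boost::noncopyable-style; the class names here are made up):

class noncopyable_base {
protected:
    noncopyable_base() {}
    ~noncopyable_base() {}                        // non-virtual on purpose; never deleted through the base
private:
    noncopyable_base(const noncopyable_base&);    // declared but never defined
    noncopyable_base& operator=(const noncopyable_base&);
};

struct Connection : private noncopyable_base {
    int handle = -1;
    // modern equivalent, no inheritance needed:
    // Connection(const Connection&) = delete;
    // Connection& operator=(const Connection&) = delete;
};

int main() {
    Connection a;
    // Connection b = a;    // error: the copy constructor is inaccessible
    (void)a;
}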

There is no reason whatsoever that a std::move, or any other function involving a move, would result in a warning. I'm suggesting a warning be generated while parsing the derived class definition.

You wouldn't expect a move from a std::unique_ptr<Derived> to std::unique_ptr<Base> to potentially trigger the warning if Base didn't have a virtual destructor? I think that's one major case that could easily be checked with few worries of false positives.

If you are sure the base class is just an interface and no one can delete through the base class, you can always declare the destructor as protected and defaulted.

This is not always an option, because it changes how the class can be used: giving it a destructor, even a defaulted one, disables copy/move. If you try to add those back, even defaulted, it breaks aggregate initialization and structured bindings. You can re-implement those manually, but there is nothing you can do to get back designated initializers. Sometimes it is better to follow the rule of zero.
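A rough sketch of that chain of consequences, assuming C++20 (the type names are mine): declaring the special members back, even as defaulted, makes the type a non-aggregate, which takes designated initializers with it.

#include <string>

struct RuleOfZero {
    int id;
    std::string name;
};

struct RuleOfFive {
    int id;
    std::string name;
    ~RuleOfFive() = default;                    // user-declared destructor: implicit moves are no longer declared
    RuleOfFive() = default;                     // adding the members back, even defaulted...
    RuleOfFive(RuleOfFive&&) = default;
    RuleOfFive(const RuleOfFive&) = default;
    RuleOfFive& operator=(RuleOfFive&&) = default;
    RuleOfFive& operator=(const RuleOfFive&) = default;
};

int main() {
    RuleOfZero a{.id = 1, .name = "ok"};        // aggregate + designated initializers still work
    // RuleOfFive b{.id = 1, .name = "nope"};   // error: not an aggregate once constructors are user-declared
    (void)a;
}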

The main issue here is people new to C++. [...] I've yet to see anyone who actually sees the code above and immediately goes: hey, this object here has a non-trivial default destructor.

If your concern is people new to C++, then you should be with the rest of us in advocating against using new/delete and against unnecessary dynamic memory allocations. Your example in the original post is very unnatural C++ and is not something I would expect a beginner to ever encounter. Converting pointers between non-polymorphic classes related by inheritance is generally only done in advanced C++ for implementing things like type erasure.

LB-- avatar Oct 07 '25 13:10 LB--

CRTP is well known and common in the C++ world.

Being well known != common. Yes, there are always a few classes here and there that have that structure, but for 99% of classes within a game engine CRTP is not a thing.

Many codebases pre-date the ability to = delete; the pattern of using inheritance for this was so well known that it was in the Boost library.

Again, defining the destructor as protected would solve the potential issue. By the way, = delete is as old as move semantics.

You wouldn't expect a move from a std::unique_ptr<Derived> to std::unique_ptr<Base> to potentially trigger the warning if Base didn't have a virtual destructor?

Two points here:

  • If you have a std::unique_ptr<Base> and you rely on the automatic memory deallocation of std::unique_ptr to come to your rescue, that would actually result in exactly the same bug that I'm suggesting goes under the radar.
  • Regardless, I'm suggesting the warning should be generated when parsing the class Derived. A std::unique_ptr<Derived> only has a pointer to Derived and doesn't participate in the class hierarchy of Base -> Derived, so it shouldn't be an issue.

This is not always an option, because it changes how the class can be used: giving it a destructor, even a defaulted one, disables copy/move.

Maybe I'm missing something, but as far as I tested, both default move and default copy are present even in the presence of a user-defined destructor (check the code below). By the way, didn't you just suggest it's a good idea to use inheritance with the "protected/private delete" that was used in Boost? Pick your battles...

If your concern is people new to C++, then you should be with the rest of us in advocating against using new/delete

Surely you realize the point of the exercise is not the new/delete part. Replace the raw pointer with any other data structure, or even delete the member variable entirely if you are using the destructor for its side effects, for example closing a file. The point of the example is to show that the destructor of your B class is not getting called. In the original example the manually allocated memory gets leaked, but if you replace it with std::string, the memory allocated by the string is going to get leaked.


#include <memory>
#include <print>
#include <string>

struct tester
{
    tester(const char* in_name) noexcept : name(in_name)
    {
        std::println("memory allocated -> {}", name);
    }
    tester(const tester &other) noexcept : name(other.name + "_c")
    {
        std::println("copied -> {}", name);
    }
    tester(tester &&other) noexcept : name(std::move(other.name) + "_m")
    {
        std::println("moved -> {}", name);
    }
    tester &operator =(const tester &other) noexcept
    {
        name = other.name;
        std::println("copy-assed -> {}", name);
        return *this;
    }
    tester &operator =(tester &&other) noexcept
    {
        name = std::move(other.name);
        std::println("move-assed -> {}", name);
        return *this;
    }
    ~tester() noexcept
    {
        std::println("memory freed -> {}", name);
    }

    std::string name;
};

struct A{
    int x,y;

    ~A(){
        std::println("~A");
    }
};

struct B: public A{
    tester t;
};

void test_val(B val)
{
    std::println("{}.x -> {}", val.t.name, val.x);
}

int main()
{
    std::unique_ptr<B> b {new B{0,0, "B"}};
    std::unique_ptr<A> a {new B{1,1, "A"}};
  
    B c{std::move(*b)};
    c.y = 5;
    test_val(c);
}

The code above will produce the following result:

memory allocated -> B
memory allocated -> A
moved -> B_m
copied -> B_m_c
B_m_c.x -> 0
memory freed -> B_m_c
~A
memory freed -> B_m
~A
~A
memory freed -> 
~A

  • Note that B (and its members) are actually moved.
  • Note that the destructor for A was called 4 times, but the destructor of tester is only called 3 times, which indicates a leak.

Heck, you don't even need a tester class; in the example below the Unaware class is still going to leak memory, because the destructor of leaker is not getting called.


#include <memory>
#include <print>
#include <string>

struct A{
    int x,y;

    ~A(){
        std::println("~A");
    }
};

struct Unaware: public A{
    Unaware() {
        std::println("Unaware");
        leaker.reserve(1024ul * 1024ul);
    }
    ~Unaware() {std::println("~Unaware");}
    std::string leaker;
};

int main()
{
    std::unique_ptr<Unaware> b = std::make_unique<Unaware>();
    std::unique_ptr<A> a = std::make_unique<Unaware>();
}

This code will output the following:

Unaware
Unaware
~A
~Unaware
~A

which suggests the memory for the string is allocated but never deleted.

Gajoo avatar Oct 07 '25 17:10 Gajoo

If you have a std::unique_ptr and you rely on the automatic memory deallocation of std::unique_ptr to come to your rescue, that would actually result in exactly the same bug that I'm suggesting goes under the radar.

Yes, that is why I brought it up and said, in the very next sentence, that it should be checked (e.g. generate a warning or error). There is no valid use case for it. With raw pointers, or in the absence of dynamic allocation, there are valid use cases though.

Regardless, I'm suggesting the warning should be generated when parsing the class Derived. A std::unique_ptr<Derived> only has a pointer to Derived and doesn't participate in the class hierarchy of Base -> Derived, so it shouldn't be an issue

It's very much an issue, because std::unique_ptr<Base> can be implicitly initialized from a std::unique_ptr<Derived>, and that currently results in no warnings. As I mentioned above, there is no valid use case for this behavior. With a raw pointer, however, the intent could be to store it for later and then cast it back to the correct type to put it back in a correctly typed smart pointer, as is very often done in type erasure. In constexpr contexts it is not allowed to cast from void*, so a common trick is to derive from an empty base instead and cast from that pointer, which is allowed. With a smart pointer, though, it's too easy to accidentally let the memory leak, since at a glance it looks taken care of. Raw pointers get people to tread more carefully.
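A minimal sketch of that constexpr trick as I understand it (the names are made up):

struct erased {};                     // empty, non-polymorphic tag base

template <typename T>
struct holder : erased {
    T value;
};

constexpr int roundtrip() {
    holder<int> h{{}, 42};
    erased* p = &h;                                   // the "type-erased" pointer that gets stored
    // static_cast<holder<int>*>(static_cast<void*>(p)) would be rejected in constant evaluation
    return static_cast<holder<int>*>(p)->value;       // casting back down through the empty base is allowed
}

static_assert(roundtrip() == 42);

int main() {}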

Maybe I'm missing something, but as far as I tested, both default move and default copy are present even in the presence of a user-defined destructor (check the code below).

It is forbidden by the standard, since it would break existing code that predated move operations. You structured your test code in the one way the standard permits it to work, but make this slight tweak and it proves my point. Also, in your own example, just try adding virtual to that ~A destructor and see what happens - it breaks the code completely.

By the way, didn't you just suggest it's a good idea to use inheritance with the "protected/private delete" that was used in Boost? Pick your battles...

I never said it was a good idea; I said it is commonly done in practice. It is obsolete in new code, but legacy codebases can't easily make sweeping changes like that. If you want a warning to be successful, it has to be careful not to piss off the people it is meant to help, lest they simply turn it off altogether.

which suggests the memory for the string is allocated but never deleted.

Yeah, like in my own example above, where I advocated there should be a warning or error when using unique-owning smart pointers, but not otherwise.

LB-- avatar Oct 09 '25 16:10 LB--

Not sure why it feels like all your examples are just confirming my points?

It's very much an issue, because std::unique_ptr<Base> can be implicitly initialized from a std::unique_ptr<Derived>

I never suggested there should be a warning on assignment. Casting to base like that is not just for type erasure. A lot of the time it might just be a way of conforming to a standard interface, and in every case other than std::unique_ptr you'll get a false positive just for conforming to the API.

make this slight tweak and it proves my point

That small tweak is not related to what I was talking about at all. A base class with a protected destructor just means no one can delete through a pointer to the base class (not even in a unique_ptr). It ensures there won't be a bad delete by accident, since the base class cannot be deleted at all. As a result, the compiler can reason that if the base class has a protected destructor, there is no reason to generate a warning.
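To make it concrete, a small sketch (the names are mine) of what the protected destructor buys you:

#include <memory>
#include <string>

struct GuardedBase {
protected:
    ~GuardedBase() = default;      // non-virtual, but nothing outside can delete through a GuardedBase*
};

struct Widget : GuardedBase {
    std::string name;              // the derived class is not trivially destructible
};

int main() {
    Widget w;                      // fine: ~Widget can reach the protected base destructor
    GuardedBase* p = &w;
    // delete p;                   // error: ~GuardedBase is protected, so the accidental UB no longer compiles
    // std::unique_ptr<GuardedBase> g = std::make_unique<Widget>();  // also rejected once the deleter needs ~GuardedBase
    (void)p;
}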

If you want a warning to be successful, it has to be careful not to piss off the people it is meant to help, lest they simply turn it off altogether.

Many warnings are opt-in; this can be one of them. People would only opt in in newer code bases.

#include <memory>
#include <string>

struct Base
{
    int x = 123;
};

struct Derived : Base // Warning should be generated here
{
    std::string y = "123";
};

int main()
{
    std::unique_ptr<Base> b = std::make_unique<Derived>(); // Not here, there are insanely many false positives here

} // No warning here either, in std::unique_ptr<Base>'s destructor; the compiler can't know Derived exists if the value comes from a different CU

Gajoo avatar Oct 09 '25 20:10 Gajoo

Not sure why it feels like all your examples are just confirming my points?

Because we agree that there should be a warning/error but we disagree on when and where it should be generated.

I never suggested there should be a warning on assignment.

I did.

That small tweak is not related to what I was talking about at all. A base class with a protected destructor just means no one can delete through a pointer to the base class (not even in a unique_ptr). It ensures there won't be a bad delete by accident, since the base class cannot be deleted at all. As a result, the compiler can reason that if the base class has a protected destructor, there is no reason to generate a warning.

Ok, but your original code also completely breaks if that ~A destructor is marked protected, which is the point I was making.

People would only opt in in newer code bases.

But the kind of code you are suggesting should generate a warning is still written in modern C++, and it is still safe.

std::unique_ptr<Base> b = std::make_unique<Derived>(); // Not here, there are insanely many false positives here

There are zero false positives, because the compiler can always reliably detect whether this is safe or unsafe. Both the Base and Derived classes must be complete types for this sort of conversion to compile, which means the compiler can see whether the Base destructor is virtual or not.

LB-- avatar Oct 10 '25 13:10 LB--