[Resolved] Possible memory leak

I’m testing large-scale object creation and removal with the following test code; this pattern matches my use case. I have found that memory grows rapidly after some time.

Here is the test code. I added it inside HandleUpdate of example 4 (StaticScene).

    int counter = 0;
    Node* mushroomNodeTest[1000];
    double obj = 0;

    void StaticScene::HandleUpdate(StringHash eventType, VariantMap& eventData)
    {
        ++counter;  // advance once per frame (increment omitted in the original snippet)

        if (counter == 1)
        {
            for (int i = 0; i < 1000; ++i)
            {
                mushroomNodeTest[i] = scene_->CreateChild("Mushroom");
                mushroomNodeTest[i]->SetPosition(Vector3(Random(90.0f) - 45.0f, 0.0f, Random(90.0f) - 45.0f));
                mushroomNodeTest[i]->SetRotation(Quaternion(0.0f, Random(360.0f), 0.0f));
                mushroomNodeTest[i]->SetScale(0.5f + Random(2.0f));
                StaticModel* mushroomObject = mushroomNodeTest[i]->CreateComponent<StaticModel>();
                // model/material setup as in the sample
            }
        }

        if (counter == 2)
        {
            for (int i = 0; i < 1000; ++i)
                mushroomNodeTest[i]->Remove();  // remove the node (and its components) from the scene
            counter = 0;
        }

    using namespace Update;

    // Take the frame time step, which is stored as a float
    float timeStep = eventData[P_TIMESTEP].GetFloat();

    // Move the camera, scale movement with time step

Try this after removing all the nodes:

    // Release all unused resources
    GetSubsystem<ResourceCache>()->ReleaseAllResources(false);

Hi, I tried both debug and release mode, with both the true and false settings.

That doesn’t seem to be working.

Best Regards

It doesn’t seem that this is a memory leak in the code. MSVC reports nothing, and Urho itself is unlikely to have a leak in such a generic mechanism.

I suppose that you are just doing the wrong thing in general, which leads to memory fragmentation and growth in the amount of memory used. Urho probably needs more pools, but that won’t happen in the near future.

What do you mean? Have you tested the function I posted?

This is the function inside the default example 4. I just added the two for loops and three globals to demonstrate the problem.

After creating and removing 8 million nodes with StaticModel components, your RAM usage will be at 1 GB.

Do you think that the bit of code I posted is wrongly written?


Yes, I reproduced the memory growth. But I think it is not a memory leak but memory fragmentation caused by allocating tons of small objects: every node and every StaticModel internally causes many allocations. I think so because this problem is invisible to leak analyzers, and there is no evidence except the task manager, which is not very reliable.

Redundant allocation is a problem in Urho, in some sense. I’d like to look at it at some point.
However, your problem is caused by your code. Nodes and components are not designed to be constantly created and destroyed, so the best solution is to avoid doing that.

BTW, are you able to crash the app with an out-of-memory error?

I haven’t been able to crash the app yet ^^.

I’m creating a discrete event simulation. The method I’m using requires creating and destroying lots of objects; that’s why I’m stress-testing this. I want to run at least 50 years of simulated time at a rate of one node created and destroyed per simulation second.

I have created 8.65 million objects in my code, and the memory growth turns out to be mostly due to creating and destroying nodes with StaticModels in the Urho3D engine. That run covered over 100 days of simulation in less than 3 minutes. My target was 1 billion objects, but I think I will run out of RAM at that rate.

Let me see if I can crash my app :).

I tested 100 million creates and deletes in my application code. The memory usage seems to be stable at 1.40 GB.


Thank you! That confirms my guess. The fact that it stabilizes means the issue is probably memory fragmentation or memory management in Windows, not a leak in Urho.

However, Urho needs some pools, IMO.


Indeed! Also, more orientation toward data-driven design.

I tested George’s code and verified that the memory usage increases due to fragmentation, with no memory leak. On Win10 with VS2013 in debug mode, it took 25 minutes to settle and fragmented half a GB on just a simple scene with his code.

I am curious as to how @Dave82’s Infested game is holding up when transitioning from indoors to outdoors and vice versa frequently.

It works without any problem so far. I haven’t encountered any memory growth with frequent removing/loading of resources. I tested it a lot, and after a certain time the memory usage stops growing. Sometimes it even goes down drastically, even when I load a new resource (possibly a bigger chunk of memory is released by the OS than the new resource takes).

Thanks for the message.
As per my last message, this question should be marked as solved for now.

I have further tested my simulation framework with 500 million creates and deletes of components and nodes (running for quite a while). The memory became stable at 1.4 GB on my system.


I know that 1000 creations and removals per frame is a bit of an extreme test, but it’s good to know that a real game doesn’t see this problem.