Virtual memory on large dataset

I’m working with a large multidimensional dataset (more than 10 dimensions and more than 100,000 points). One particular problem I have is with the use of virtual memory (for example, via new). When I input a dataset with more than 6 dimensions and 100,000 points, the program halts with the message “virtual memory exceeded in new()”. Can you provide some general guidelines for working with large datasets?

The program I created uses a dynamic linked-list structure, a fixed-size lookup table (equal to the data size), and dynamic arrays created in a loop (I use new and delete within the loop so that the memory can be freed and reused).

It’s hard to be very specific without more information, but it sounds like you are simply running out of memory.

While I don’t know how much memory each item in your dataset takes, even if each one were only 1 byte (very unlikely), a dense 6-dimensional grid with 100,000 positions along each axis would need (10^5)^6 bytes — a 1 with 30 zeros following. And it sounded like you were trying to allocate memory on top of that!

I know you provided more details, but it wasn’t entirely clear how those elements fit together, and without understanding the task I can’t be more specific.

You need to rethink your approach: see if there are ways to use less memory, or whether you can allocate only small portions of your data at a time. Either that, or move to a purely disk-based approach and wait 5 or 10 years for hard disks that can handle that much data.
