Report: Code Quality Improves for Open Source Big Data Tools

Coverity has released its annual report on code quality, and this year’s study holds good news for open source big data tools. Among the open source big data tools examined, 62 percent had defect density rates (DDRs) below 2.72, which is the average DDR for Java projects. Big data tools making big improvements over 2013 included Hadoop, which improved from a 1.71 DDR in 2013 to 1.67 in 2014; HBase, which fell from 2.33 to 2.22, despite adding 200,000 lines of code; and Cassandra, which decreased from 1.95 to 1.61.
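Coverity's defect density metric is conventionally expressed as confirmed defects per 1,000 lines of code, which is how a figure like HBase's 2.22 is derived. As a rough sketch of that arithmetic (the defect and line counts below are hypothetical, chosen only to illustrate the calculation, and are not taken from the report):

```python
def defect_density(defects: int, lines_of_code: int) -> float:
    """Defect density rate (DDR): confirmed defects per 1,000 lines of code."""
    return defects / (lines_of_code / 1000)

# Hypothetical example: 1,110 confirmed defects in a 500,000-line codebase
# yields a DDR of 2.22 -- matching HBase's reported 2014 figure by construction.
print(round(defect_density(1110, 500_000), 2))
```

This also shows why HBase could add 200,000 lines of code and still lower its DDR: the denominator grows, so the rate falls as long as new defects accumulate more slowly than new code.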

“Early efforts of the big data projects tracked by Coverity Scan are showing interesting and actionable results,” said Zack Samocha, director of marketing for Coverity. “IoT and big data have the power to transform lives and our economy. There’s a great deal riding on these foundational technologies, and these organizations are taking that responsibility seriously. It’s encouraging to see their commitment to addressing critical defects and to taking the appropriate steps to deliver higher quality software to the market.”
