
An Interview With Datalight - Part 2

Earlier I ran part 1 of an interview with Kerri McConnell and Ken Whitaker of Datalight. Among other things, the company sells high-reliability filesystems. Here's part 2, where we talked about how they go about building their code.

Jack: Ken, I see from your web page that you are an active PMI® member, Project Management Professional (PMP)® certified, and a Certified ScrumMaster. Tell me about Datalight's software engineering processes. I know from an earlier discussion that agile is important to you folks.

Datalight: Yes, Jack, as important as timely project delivery is to our customers, having experience as a PMP and CSM has been a real help for me in leading our teams. I spend as much time leading our teams as I do on technical work, and I must admit: I like the mix. Because process and procedures are an important ingredient to get right, I'm open to any set of approaches that supports quality and customer needs. Since Datalight emphasizes top-rated technical support, it is paramount that we focus on quality assurance and customer satisfaction from the very beginning, at project inception.

Historically, the prevailing perspective has been that embedded software can't be developed using an agile approach. Years ago, we found that a traditional waterfall, predictive approach didn't drive home quality testing early enough in the project lifecycle. Since software development tends to be more of a "learn and discover" adaptive exercise, building products in an iterative fashion just made sense.

As a result, Datalight has been working closely with customers using more of a test-driven approach with time-boxed sprints. Not only have we seen more on-time product deliveries and more time devoted to actual work, it has also improved my staff's motivation. With a renewed emphasis on task completion and total team collaboration, our teams complete tested code every two-week sprint that we can demonstrate to customers along the way. I can't imagine building software any other way, and the customers for whom we do custom engineering really appreciate the early access to code and the complete transparency of project status.

Jack: How do you folks go about testing the code?

Datalight: Everybody in engineering is focused on quality, but Datalight has always invested in an in-house, independent QA department. Our test engineers work closely with product management, technical support, and engineering to ensure that test plans and test cases cover the functionality provided by Datalight's software products.

Our validation suites are a combination of functional, integrity, and performance tests. To ensure that tests are run frequently, our QA team scripts a standard set of tests that run automatically during a nightly build process on a wide assortment of supported platforms using many different toolsets.
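To make that concrete, here is a minimal sketch of how such a scripted suite might group functional, integrity, and performance checks so a nightly job can run them unattended and fail the build on any error. The test names, categories, and stubbed checks are hypothetical illustrations, not Datalight's actual suite.

    /* nightly_tests.c - an illustrative harness only, not Datalight's suite.
     * Each test returns 0 on success so a nightly build script can fail
     * the build on any non-zero exit code.
     */
    #include <stdio.h>
    #include <stddef.h>

    typedef int (*test_fn)(void);

    typedef struct {
        const char *name;      /* reported in the nightly log                 */
        const char *category;  /* "functional", "integrity", or "performance" */
        test_fn     run;
    } test_case;

    /* Hypothetical checks standing in for real file-system tests. */
    static int test_create_and_delete(void)     { return 0; }
    static int test_data_survives_remount(void) { return 0; }
    static int test_sequential_write_rate(void) { return 0; }

    static const test_case tests[] = {
        { "create_and_delete",     "functional",  test_create_and_delete },
        { "data_survives_remount", "integrity",   test_data_survives_remount },
        { "sequential_write_rate", "performance", test_sequential_write_rate },
    };

    int main(void)
    {
        unsigned int failures = 0U;
        size_t i;

        for (i = 0; i < sizeof(tests) / sizeof(tests[0]); i++) {
            int rc = tests[i].run();
            printf("[%-11s] %-24s %s\n", tests[i].category, tests[i].name,
                   (rc == 0) ? "PASS" : "FAIL");
            if (rc != 0) {
                failures++;
            }
        }

        return (failures == 0U) ? 0 : 1;
    }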

In combination with this comprehensive set of internally-created tests, QA employs industry-standard tests, static code analysis tools, and code reviews as part of our commitment to quality. To ensure that feature requests and issues are properly recorded, prioritized, and managed, Datalight utilizes sophisticated tracking systems.

Since customer support continues to be critical to Datalight's success, our QA and engineering teams allocate enough time to assist in issue escalation. Our commitment to quality and to customer support excellence resulted in Datalight being recognized with a Stevie Award in 2015.

Jack: How do you go about ensuring that your products truly are of a very high quality and essentially defect free?

Datalight: In addition to engineering unit tests, QA builds automated tests as the most efficient way to verify that code complies with a product's requirements and interface (API) definitions. Although it is very difficult for any software vendor to guarantee defect-free code, the combination of a full set of tests and building under a broad assortment of compilers should give our customers confidence that Datalight takes code quality seriously. Once our code is integrated into a customer's system, our tests can be run in combination with the customer's own tests to ensure that the integrated solution works as expected.
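As a rough illustration of the kind of verification a customer could re-run after integration, the sketch below writes a known pattern through a file API and reads it back byte-for-byte. Standard C stdio calls stand in for the product's API here, since the real interface isn't shown in this interview; an actual test would call the vendor's documented functions instead.

    /* api_roundtrip_test.c - illustrative only.  Standard C stdio calls
     * stand in here for a product's file-system API.
     */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        static const char pattern[] = "0123456789ABCDEF";
        char readback[sizeof(pattern)];
        FILE *f;

        /* Write a known pattern through the API under test. */
        f = fopen("roundtrip.dat", "wb");
        if ((f == NULL) || (fwrite(pattern, 1, sizeof(pattern), f) != sizeof(pattern))) {
            fprintf(stderr, "write failed\n");
            return 1;
        }
        fclose(f);

        /* Read it back and verify byte-for-byte integrity. */
        f = fopen("roundtrip.dat", "rb");
        if ((f == NULL) || (fread(readback, 1, sizeof(readback), f) != sizeof(readback))) {
            fprintf(stderr, "read failed\n");
            return 1;
        }
        fclose(f);

        if (memcmp(pattern, readback, sizeof(pattern)) != 0) {
            fprintf(stderr, "data mismatch\n");
            return 1;
        }

        printf("round-trip OK\n");
        return 0;
    }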

Jack: What benefits have you seen from following MISRA, and what have been the challenges?

Datalight: Datalight has taken the path with our most recent product offering, Reliance Edge, to comply with the guidelines specified by the MISRA C:2012 standard. This conformance has strengthened our internal processes and procedures to ensure that our code is as robust as possible. After reviewing the guidelines, we trained everyone in engineering on MISRA C:2012. Engineering mapped out a compliance matrix and identified the steps we would take to assure that the right level of conformance validation was being performed. After consulting with the Barr Group, Datalight entered into a time-intensive validation effort with engineering and QA staff that resulted in the adoption of new static code analysis tools from Gimpel Software [Note from Jack: Gimpel sells PC-lint, a must-use syntax checker on steroids], adjustments to our coding standards, and improvements to our source code after performing code reviews.

The challenge has quite honestly been the commitment of time and resources to validate MISRA C:2012 compliance. The benefit is obvious: we believe we have a better product as a result of going through the compliance verification process.
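For a feel of the kind of changes such an effort typically drives, here is a small, hypothetical example in the spirit of MISRA C:2012: fixed-width types, a single point of exit, and essentially-Boolean controlling expressions. It is not Datalight's code and not a substitute for a full compliance matrix.

    /* misra_style.c - illustrative only; shows the flavor of changes a
     * MISRA C:2012 effort typically drives, not Datalight's actual code.
     */
    #include <stdint.h>
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Fixed-width types instead of plain int/char, a single point of
     * exit at the end, and explicit comparisons rather than implicit
     * boolean conversions.
     */
    static bool buffer_contains(const uint8_t *buf, uint32_t len, uint8_t value)
    {
        bool     found = false;
        uint32_t i;

        if (buf != NULL) {
            for (i = 0U; (i < len) && (!found); i++) {
                if (buf[i] == value) {
                    found = true;
                }
            }
        }

        return found;
    }

    int main(void)
    {
        static const uint8_t sector[4] = { 0x10U, 0x20U, 0x30U, 0x40U };

        /* Prints "1 0": 0x30 is present, 0x99 is not. */
        printf("%d %d\n", (int)buffer_contains(sector, 4U, 0x30U),
                          (int)buffer_contains(sector, 4U, 0x99U));
        return 0;
    }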

Jack: What sort of tools do your software people use, and what are their advantages and disadvantages?

Datalight: Since Datalight supports a wide variety of embedded targets and flash memory technologies, we have relationships with manufacturers that give us access to all of the equipment we need. We have server systems dedicated to nightly build processing, code management and source control, DevOps (continuous integration) deployment, and overall software development tools. Our development systems are primarily Windows and Linux based, with a number of cross-compiler, simulator, and debugging tools.

All software assets are backed up locally and offsite. In addition to an assortment of internally-developed tools, Datalight uses industry-standard tools including Bugzilla, Git, Doxygen, and Jenkins [Note from Jack: Jenkins is a continuous integration system].
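Since Doxygen is on that list, the following fragment sketches what a Doxygen-commented C function typically looks like. The function itself is a made-up example, not taken from Datalight's code base.

    /* doxygen_example.c - illustrative only; shows the Doxygen comment
     * style, not a function from Datalight's code base.
     */
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    /**
     * @brief  Compute a simple additive checksum over a buffer.
     *
     * @param[in] data  Pointer to the bytes to sum; must not be NULL.
     * @param[in] len   Number of bytes to sum.
     *
     * @return The 32-bit sum of all bytes in @p data.
     */
    static uint32_t checksum32(const uint8_t *data, size_t len)
    {
        uint32_t sum = 0U;
        size_t   i;

        for (i = 0; i < len; i++) {
            sum += data[i];
        }

        return sum;
    }

    int main(void)
    {
        static const uint8_t msg[3] = { 1U, 2U, 3U };

        printf("checksum = %lu\n", (unsigned long)checksum32(msg, 3)); /* prints 6 */
        return 0;
    }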

Any questions or issues are tracked by our customer support team and logged with customer contacts in our Salesforce.com customer relationship management (CRM) tool. This has the distinct advantage of enabling customer-centric collaboration between our engineering organization and our sales and product management departments.

Jack: How do you convince your customers that this software is, indeed, of extremely high quality? This is a claim everyone makes.

Datalight: Datalight doesn't release a product with compiler errors or warnings, and we run a combination of short-term and long-term tests prior to commercial release. To ensure robustness, our QA team runs a variety of systems under extreme activity loads while programmatically forcing power interruptions. To enable our customers to validate the integration of Datalight's products in their unique environment, a complete set of tests is included with every product licensed.
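A rough sketch of such a power-interruption loop is shown below. The write_heavy_load(), interrupt_power(), remount_volume(), and verify_volume() hooks are hypothetical stand-ins; a real rig would drive a programmable power supply or a flash simulator rather than stubs.

    /* powerfail_loop.c - illustrative sketch of a power-interruption test.
     * The hooks below are hypothetical and stubbed so the sketch compiles.
     */
    #include <stdio.h>
    #include <stdlib.h>

    static void write_heavy_load(void)   { /* hammer the file system */ }
    static void interrupt_power(void)    { /* cut power at a random moment */ }
    static int  remount_volume(void)     { return 0; /* 0 = mounted cleanly */ }
    static int  verify_volume(void)      { return 0; /* 0 = data intact     */ }

    int main(void)
    {
        const unsigned long cycles = 10000UL;
        unsigned long i;

        for (i = 0UL; i < cycles; i++) {
            write_heavy_load();     /* sustained write/delete activity */
            interrupt_power();      /* power removed mid-operation     */

            if ((remount_volume() != 0) || (verify_volume() != 0)) {
                fprintf(stderr, "integrity failure after cycle %lu\n", i);
                return EXIT_FAILURE;
            }
        }

        printf("%lu power-interruption cycles survived\n", cycles);
        return EXIT_SUCCESS;
    }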

Jack: Do you keep metrics about the code?

Datalight: Although there are different schools of thought regarding the effectiveness of metrics, Datalight keeps a tally of SLOC (source lines of code) and ELOC (effective lines of code), and in some cases performs a cyclomatic complexity validation. Our intention is to make sure that our source code follows the standards in our internally-created Datalight Source Code Guidelines document.
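As a reminder of how one of those metrics works, the sketch below counts cyclomatic complexity for a small, made-up function: the number of decision points plus one.

    /* complexity_example.c - illustrative only.  Cyclomatic complexity is
     * commonly computed as the number of decision points plus one.
     */
    #include <stdio.h>

    /* Decision points: the for condition (1), the if (2), and the
     * && operator (3), so cyclomatic complexity = 3 + 1 = 4.
     */
    static int clamp_and_sum(const int *values, int count, int limit)
    {
        int sum = 0;
        int i;

        for (i = 0; i < count; i++) {                       /* decision 1 */
            if ((values[i] > 0) && (values[i] <= limit)) {  /* decisions 2 and 3 */
                sum += values[i];
            }
        }

        return sum;
    }

    int main(void)
    {
        const int data[] = { 3, -1, 7, 12 };

        printf("sum = %d\n", clamp_and_sum(data, 4, 10));   /* prints: sum = 10 */
        return 0;
    }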

Datalight's code is designed to be portable ANSI C that builds cleanly across software development tools (compilers) and target systems. Our engineering team subscribes to the following principles (a short illustrative sketch follows the list):

- Write code as simply as possible; this benefits ongoing maintenance and results in fewer defects
- Organize the functional hierarchy for reliability and best response
- Keep units (functions and methods) short
- Keep the number of elements in each of these units small
- Comment heavily (the more comments that help explain the code, the better)
- Employ code reviews as often as possible, involving both software developers and test engineers
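Here is the promised sketch: a short, heavily commented unit in the spirit of those guidelines. It is illustrative only, not code from Datalight's products.

    /* guidelines_example.c - illustrative only; a short, well-commented
     * unit in the spirit of the guidelines above, not Datalight code.
     */
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Return true when the candidate block number lies inside the
     * managed region.  Kept deliberately short: one purpose, two
     * parameters, no side effects, so it is easy to review and test.
     */
    static bool block_in_range(uint32_t block, uint32_t block_count)
    {
        /* Blocks are numbered 0 .. block_count - 1. */
        return (block < block_count);
    }

    int main(void)
    {
        /* Prints "1 0": block 3 is inside an 8-block region, block 9 is not. */
        printf("%d %d\n", (int)block_in_range(3U, 8U), (int)block_in_range(9U, 8U));
        return 0;
    }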

Jack: I know it's impossible to project very far ahead in this industry, but what do you see for the embedded space and Datalight's place in it in five years?

Datalight: We continue to be astounded by the growth of data, in terms of both volume and importance. The Internet of Things (IoT) is the industry buzz at the moment, and the plethora of sensors out there is driving more and more autonomous systems that make decisions and take action based on the data. Clearly, it is becoming even more critical that the data be reliable, and that is our strength. Looking forward, we believe that the storage systems for critical data will need to become smarter themselves: getting better at inferring meaning from context and the association of data from different sources, storing it for an appropriate access time, and securing it. That's where we're focused in our longer-term product development and innovation efforts.

Jack: It would seem a natural for you to sell a package that safely handled firmware updates. Is that part of one of your products?

Datalight: Reliance Nitro and Reliance Edge both have capabilities today that make that use case easier and more foolproof than with any other file system. Our customers use the programmability of our transaction models, what we call Dynamic Transaction Point technology, to safely update systems in the field.
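To show the general shape of such an update, the sketch below holds off committing until a new image is fully written and verified, then commits once. The fs_* calls are hypothetical stand-ins, not Datalight's actual API; the point is the ordering: write, verify, then commit.

    /* safe_update.c - illustrative sketch of a transactional firmware
     * update.  The fs_* hooks are hypothetical and stubbed so the
     * sketch compiles; they are not Datalight's API.
     */
    #include <stdio.h>
    #include <stdbool.h>

    static bool fs_write_new_image(const char *path) { (void)path; return true; }
    static bool fs_verify_image(const char *path)    { (void)path; return true; }
    static bool fs_commit_transaction(void)          { return true; }

    int main(void)
    {
        const char *image = "firmware_new.bin";

        /* 1. Write the new image while automatic commits are held off, so
         *    a power loss here leaves the previous committed state intact. */
        if (!fs_write_new_image(image)) {
            fprintf(stderr, "update aborted: write failed\n");
            return 1;
        }

        /* 2. Verify the image before making anything permanent. */
        if (!fs_verify_image(image)) {
            fprintf(stderr, "update aborted: verification failed\n");
            return 1;
        }

        /* 3. Commit once; only now does the new image become the
         *    visible, durable state. */
        if (!fs_commit_transaction()) {
            fprintf(stderr, "update aborted: commit failed\n");
            return 1;
        }

        printf("update committed\n");
        return 0;
    }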

Thanks, Ken and Kerri, for your thoughts. I find it fascinating to talk to companies about their software engineering processes and will, from time to time, continue to post about different approaches used today.

Published July 6, 2015