A key factor behind the massive problems and delays Monday in the Iowa Caucuses was the failure of an app used by the Iowa Democratic Party that was meant to make reporting results faster and easier. It ended up doing neither.

Kristin Simonini, from Framingham-based crowd testing company Applause, spoke with WGBH News' All Things Considered anchor Arun Rath about what could have been done differently with the party's app. This transcript has been edited for clarity.

Arun Rath: I should probably say thank you for giving us a way to talk about the results in Iowa.

Kristin Simonini: Yes, absolutely, though this is not how any of us would want to do that, I suppose.

Rath: I think definitely not. But this is interesting. It seems that one of the problems was that the Iowa Democratic Party didn't release the technical details of the app. That's something that I've heard bandied about. Can you explain why that would be a big deal?

Simonini: I think it basically sounds like things were left down to the wire before it was made known that this app was going to be used. And, given the timeline I've seen quoted, they probably got to the endgame without having finished development, didn't leave enough time for ample testing, and then found themselves in this situation.

Rath: I can see why a company or entity wouldn't want to release technical details, for trade secret or other reasons. Why would that have helped us understand what's going on?

Simonini: What we see every day with our customers is that it's incredibly important to test with real end users, because, first of all, you can't predict how users are going to interact with the app in question. But even more so in this case, you need to test it across different devices, locations, networks and bandwidths. There are a lot of variables there that can surface issues, and you can't anticipate them, nor can you effectively reproduce them in a lab. We work with customers all the time that have pre-production, proprietary applications, and they still use this crowd testing methodology. It lets them keep their secrecy if they need to, for something that's not really ready for prime time, while also ensuring that quality is what they want it to be at production.

Rath: You mentioned this going down to the wire. We saw from reports that the people responsible for reporting results with the app were only given access to it on Friday, and there were complaints that there wasn't enough training. Is that fair?

Simonini: Oh, yes. Certainly, if the audience in question didn't know how to use the app and didn't have enough time to ramp up on it, that would be a problem. And from everything I read, the problem they did find was found relatively quickly. In other words, if they had done a proper round of testing, one would think they would have been able to find, identify and fix it before it got into the hands of its end users.

Rath: "The Wall Street Journal" reports that this app comes from a company, ironically, named Shadow. Do you happen to know anything about this company? I don't suppose you've tested apps for Shadow.

Simonini: I am not familiar with that company, no. But it does highlight another point: when you've got one company doing the development, having another company responsible for the testing is a very good practice. It ensures you're getting the right eyes on it and that the right checks and balances are in place for quality.

Rath: Kristin, you study how apps work, why apps work. Given all the warnings we've heard over the last four years about Russian hacking, is using an app a safe way to convey election results, or is it a bad idea?

Simonini: Well, there's quality testing and there's also security testing, and there are ways people can protect their solutions and put the right protocols and measures in place to ensure that security is there. So I don't think it's something we should be turning our back on. I think technology can be very powerful. But it just highlights that there are measures that need to be taken, particularly for things that demand a high degree of confidence and security, and those things need to be tested for.