There was considerable mouth-dropping from publications such as The New York Times at initial reports this week that NSA programs are gathering both telephone records and information gleaned from large tech companies like Google and Microsoft. But as those reports have settled in, reactions have gotten more complex.

One intriguing line of thought came from David Simon, a Baltimore Sun crime reporter turned TV writer who created, among other projects, the acclaimed HBO show The Wire. Literally named after police surveillance tactics, The Wire largely exists as a critique of the failures of government institutions — especially the way the government investigates and responds to crime.

In a lengthy post on his own site, Simon argues that the sheer breadth of the information being collected by the NSA means that very little of it is actually being looked at; it's being put into a database to be used later, in ways that will raise more serious privacy and policy questions than the collection itself.

"That is tens of billions of phone calls," he argues, "and for the love of god, how many agents do you think the FBI has?"

Simon posits that what will determine whether these programs are illegal, unconstitutional, discriminatory or otherwise privacy-violating will be what happens to this data and what decisions are made about how to use it. If they abuse the information, he says, the problem will be the abuse, not the possession of the data, which is a horse both (1) out of the barn and (2) of a different color from targeted eavesdropping.

But for a lot of us, this certainly had the feeling of sharp, strange intrusiveness, and as is often the case, very real discomfort came out through semi-dark jokes like the ones NPR's Andy Carvin collected under the hashtag "#CallsTheNSAKnowsAbout."

"I rarely answer my mother's calls the first time she tries to reach me," offered one reader. "Sometimes Grandma and I have long, uncomfortable pauses," offered another. We envisioned the NSA reading our e-mails, looking at our status updates, and seeing that we haven't called the dentist like we said we would.

This was it, in the popular imagination — some supercomputer of intrusive eyeballing come to life, a combination of Skynet and HAL 9000 and the guys on Law & Order who can improve the quality of a bank surveillance video until they can make out the logo on your underwear through your pants.

But would we really care? Would the growing number of people who willingly share so much of what they do on Twitter and Facebook and Foursquare be horrified that the government could, in theory, look at a database of their phone calls? If you spend your time posting, "Here's a map showing where I am, a list of people I'm with, a description of what I'm doing, a picture out my window, a list of the companies I buy from, a list of political causes I support, three articles I just read, and my review of the movie I just saw and where I saw it," what are the odds that the existence of a database saying your phone called this other phone for 4 minutes and 19 seconds will shock your conscience?

The way we live now, we use our data as a currency. Maybe we should, maybe we shouldn't, but we do. In fact, any time you appear to be getting something for nothing, there is an excellent chance that you are paying in part with your personal information. Store loyalty cards give you discounts, which you get in return for overlooking or accepting that someone now has (or could have) a history of everything you've ever bought.

If the government assigned you a grocery card that tracked all your food purchases, it would seem like something out of a dystopian everybody-in-a-jumpsuit movie. But when your grocery store does it in return for a dollar off a bottle of salad dressing, that's an absolutely everyday occurrence for many of us. Going through toll booths is much easier and faster if you have a device on your windshield that automatically charges your account as you drive through — of course, that means there's a record of every time you drove through.

It's a mistake to imagine that all of this is hidden from view, or that consumers do all this without knowing they're doing it. Many, many perfectly savvy people know very well that their information is being constantly accumulated by Verizon and Sprint, Google and Amazon and Apple and Netflix, as well as banks, stores, car dealerships, cable companies, doctors, and insurance companies. In fact, the growing problem of managing the sheer number of passwords we all have comes from the sheer number of entities that we know hold information about us, or belonging to us, that we don't want just anyone to be able to access.

Many of us do this not because we don't know about it, but because there are benefits to it. Storing stuff in the cloud, to use just one example, is great for lots of reasons — you can access it from anywhere, you don't have to store it, and you don't have to worry about frying your hard drive (or burning up your file cabinet) and losing all your contacts (or all your work, or your books, or your music).

Of course, that means somebody else somewhere — actually a lot of somebodies, none of whom you've ever met, whose good faith and security practices you are effectively trusting — could find out whenever they chose what books you bought, what's in your email, what you listen to, what documents you've written, whom you talk to, and what organizations you contribute to. You just have to figure they won't.

It's an economy of often queasy trust that applies to private companies the same principle Simon is applying to the government: it's probably not that big a deal, this thinking goes, that they have a massive motherlode of information, because they probably aren't going to do anything with it, and if they do, that's where the problem arises. That's what most people believe about their email, for instance — sure, the email provider could read it, but probably nobody is going to bother. We assume that we will hide in the crowd, protected by its very size.

That's true even though, on a regular basis, tales emerge of data being used in ways we haven't thought of. A 2012 New York Times story detailed how Target set out to find a statistical, data-mining method of determining whether a woman was pregnant even if she did not want them to know. We know now that companies use data about the kinds of mundane things over which we've surrendered control in order to learn the kinds of private and personal things we still want to control.

And the reason this economy of queasy trust continues to flourish is that candidly, if you look at the numbers, it's worked OK for most of us in practice. Most of the time, nobody calls you out for ordering a politically charged book for your Kindle or watching a dirty movie on demand, nobody confronts you about the comment you made in an email, and nobody comes to your house to ask why you bought wine while you're pregnant. They could, but under ordinary circumstances, they don't. If Target finds out you're pregnant, all they really want to do is send you some coupons.

In a sense, we're most comfortable with the profit motive as a reason to collect information. Amazon or Apple might know all about you, but the thing they're most likely to do with that in practice, so far, is try to sell you stuff. That's where the theory that the economy of trust in private companies can be extended to government hits one of several major snags.

We "trust" private companies to be reliably self-interested, and we don't believe there's a self-interested reason for them to read our email other than, worst-case scenario, to sell. And we further believe that their self-interest works to our advantage, because it wouldn't be good business for people to find out that Google employees (for instance) were reading and passing around people's embarrassing email on their lunch breaks. We don't really trust them to limit their use of data in our best interests, but we might trust them to limit it in their own best interests. Part of our belief that they won't do anything nefarious with the theoretically breathtaking information dump they have comes from the belief that it wouldn't make them any money.

Government has no such transparent single motive, like profit, but a variety of motives, not all of which people are confident they know about. What you believe to be the motives of a particular administration or government agency depends on a complicated, often highly charged calculus of politics, policy, media consumption, and internalized constitutional theory that you may not have even verbalized but know in your gut. (The Fourth Amendment, really, has been developed by courts but is historically rooted in our collective sense of, "They can't come in my house and do that when I didn't do anything wrong.")

Furthermore, government has powers that private companies don't have. Target and Apple can't arrest you, deport you, fine you, jail you, try you, charge you, tax you, or confiscate your stuff. What's more, they are the final arbiters of nothing; you have potential recourse in court if they wrong you. Your only recourse, should the tentative trust you place in government prove misplaced, is ... more government. Other branches, levels and segments of government. And, ultimately, your ability to vote for somebody else.

The problem is that if it proves to be true that, because of these programs, the loose trust you place in Google is now extended to the government without your consent (or, until now, your knowledge), then whether your sympathies lie with David Simon or with The New York Times, you don't have a choice. You can't opt out of that part. The benefits remain the same — your free email, your chat applications, your cloud storage — but the costs are different. Simon displays some actual relief that the government is using phone records and Internet information to potentially prevent crime; that's a trade-off he won't be alone in his willingness to make.

And even those who aren't relieved may be grudgingly willing to accept government access to data in return for whatever security advantages it might bring, along with the mundane advantages of doing business with whatever tech company or cell phone provider is at issue. But not everybody is going to feel that way. Not everybody believes that they can predict what the government will do the way they think they can predict what Apple will do.

But what does not change with our willingness to make trade-offs is the constitutional question the Times raised of whether search, seizure and privacy provisions are being violated. Constitutional provisions, in part, exist to counteract our natural tendency to make bad deals in challenging historical moments, and to protect people who don't want to make those deals from those who do. There is a value proposition in allowing both government (for security reasons) and business (for convenience reasons) to suck up data, but there was a value proposition in almost everything the constitution in its amended form was written to prevent.

There was a value proposition for at least some citizens in punishing people without trial, in denying universal suffrage, in punishing speech, and in slavery. That's why prohibitions on those things are in the constitution, out of the easy reach of legislatures — we choose to outlaw them even if, in a specific context, a democratically elected government would vote for them. If all of this is constitutional — and it certainly has not yet been found not to be — that can't be simply because it's OK with enough of us, right now, considering what we get in return. That question requires a different approach.
