Is it Possible for Qubes to Provide a Backdoor for Some Exceptions?

Sorry for going a bit off-topic, but the question will become on-topic.

We know that one purpose of privacy is security. For example:

  • A magician needs to keep their tricks secret.
  • In cooking, there are secret recipes.
  • Writers and producers keep their work private before release.
  • Public speakers keep their content private before the event.

But privacy can also be misused. For example:

  • Privacy-related technology can be misused for illegal activity.

But I think any product can be misused. For example:

  • A kitchen knife made to chop meat can also be misused to stab people.

So misuse alone doesn’t mean a product cannot be produced and used, because everything can be misused.

So, is it possible for Qubes to provide some kind of backdoor for certain exceptions? For example:

  • for the government/police to monitor citizens
  • for parents to monitor their kids
  • for a religious organization to monitor its members

Hey @newbie. Interesting questions, however, as they’re not Qubes-specific, I’m afraid I have to close this. A better place to ask may be:

We also have a category #all-around-qubes for older forum members (trust level 2+) to discuss general privacy and security questions like those.

edit: upon re-reading I notice an effort to make it Qubes-specific. However, nothing in it depends on the technology. If someone disagrees with my assessment, please do let me know.

thank you for the title :pray:

Ok, I’ll bite. Let’s see where the rabbit hole goes. Keen to discuss.

The answer is always: no. It is not possible to make software insecure only in particular situations, no matter what the software is.

You should look at Roger Dingledine’s arguments (perhaps in this video) on why that’s impossible. The key argument is:

  1. First let all the “backdoor” interested parties discuss exactly who gets to have the keys
  2. Then we see if it’s technically possible.

The point being that there will never be a consensus as to who should get the keys, therefore the question of technical possibility does not even need to be discussed. Once you give keys to someone, others will scope-creep the hell out of it and request more and more access.

Plus, it is extremely likely that, if forced to ship backdoors, the devs of a project like Qubes would simply stop working on it… Complying may make sense for a big business with huge financial stakes, but not here.

In conclusion, the answer is no.


I think this was being discussed a while ago here:

Good point. For context that was way before we had the #all-around-qubes category.

First let all the “backdoor” interested parties discuss exactly who gets to have the keys

Plus, it is extremely likely that, if forced to ship backdoors, the devs of a project like Qubes would simply stop working on it… Complying may make sense for a big business with huge financial stakes, but not here.

I like your argument.

But what if someone with money and power donated a big amount to the project and all its devs, or maybe ran a secret crowdfunding campaign, to get a backdoor shipped to monitor someone who matters little to the project? Someone who struggles daily for his rights, privacy & security?

Maybe (not sure if this counts as an example): the US government donating a big amount to spy on Snowden?

That isn’t convincing as a general argument, since it assumes that step 1 will never result in a consensus. A moment’s thought tells you that is nonsense. (I haven’t listened to the YouTube piece, so I don’t know if you have summarised the argument correctly.) So, since step 1 can be passed, we move to step 2. And yes, it’s obviously technically possible in some cases, and perhaps possible in the case of Qubes.

Could it happen? Probably not. That depends on trust in the developers, review of the source code, and trust in the build process. You can help yourself by following the guides on checking install media, and so on: that guards against external actors tampering with what you install.
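The “checking install media” step mentioned above boils down to detached-signature verification with GnuPG. Below is a self-contained sketch of that flow, not the official Qubes procedure: it generates a throwaway demo key instead of importing and fingerprint-checking the real Qubes Master Signing Key (which the official verification guide covers), so the end-to-end mechanics can be shown anywhere.

```shell
set -e
export GNUPGHOME="$(mktemp -d)"   # throwaway keyring, discarded afterwards
cd "$GNUPGHOME"

# 1. Stand-in for the release signing key. In reality you would import the
#    Qubes release key and verify its fingerprint via multiple channels.
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Release Key
Expire-Date: 0
%commit
EOF

# 2. "Release side": an image file plus a detached, ASCII-armored signature.
echo 'pretend this is the installation image' > qubes.iso
gpg --batch --quiet --armor --detach-sign --output qubes.iso.asc qubes.iso

# 3. "User side": verify the downloaded signature against the downloaded
#    image. A non-zero exit code here means the image must not be trusted.
gpg --verify qubes.iso.asc qubes.iso 2>/dev/null && echo VERIFIED
```

The point of the detached signature is that tampering with either the image or the signature after release makes step 3 fail, so an external actor would need the private signing key, not just control of a download mirror.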


Technology is made by humans.

Humans are inherently vulnerable to pressure and force.

There are some academics talking about “zero-trust” things that I don’t fully understand (yet). However, in the real world, when talking about security, sooner or later you reach a point where you have to place trust somewhere:

  • Trust in software/developers
  • Trust in hardware/vendors
  • Trust in cryptographic algorithms
  • Trust in key storage
  • Trust in physical security

To make it worse: even if for some reason you decide you can trust all of the above 100%, you are still only talking about intentions and known things. There are still humans involved, and they make mistakes, which can be abused by other malicious (maybe smarter) humans.

==> there is no absolute security

For the particular use case of running Qubes OS:

  • you can decide to trust the core team
  • you can review the code (if you understand it and have the time)
  • you can install coreboot/Heads and disable the Intel ME
  • you can inspect your hardware for modifications (if you have the skills and equipment)
  • you can physically keep your setup on you or locked away

…even if you did all of that, it still doesn’t guarantee the absence
of Xen/Qubes zero-days being abused by governments.

==> there is no absolute security

All you can do is look at all the threats and what you can do to mitigate them, not just in software and hardware but, most importantly, in you: the user. And then start on Plan B: what to do if you get compromised anyway.


Makes sense: some things are beyond our knowledge and capacity, so we need to delegate and trust.

I think the backdoor key is like the ring in The Lord of the Rings: everyone is a good guy with good intentions, until they have the ring.

Thank you, I will try

What do you mean by “possible”?

Here are some different senses of the word you might mean:

  • Technically possible, i.e., does the current or future state of human technological progress allow for its implementation?
  • Epistemically possible, i.e., is it already happening, and we just don’t know it?
  • Likely, i.e., how high is the likelihood that the Qubes OS Project or the team behind it will do one or more of these things?

No to any and all “back doors”. Unacceptable!
I have heard arguments about topics like this. The user usually says something like:
(User) …what if I forgot my password? (Me) Oh well, reimage and restore from backup.
(User) …I forgot to back up, or I haven’t backed up in a while, or I don’t know how to do that…
(Me) …well, maybe you shouldn’t be using a computer, let alone any technology.

I believe if this were to happen, Qubes OS would see a massive fallout.

Maybe “possible” is not a good term for the question; sorry for the ambiguity. But I think you already gave good interpretations and examples. I mean all of those.

Ok, then. Let’s look at each one:

  • Technically possible, i.e., does the current or future state of human technological progress allow for its implementation?

Yes, of course. There are many documented cases of backdoored software in the history of computing.
Just like with almost any OS, there is nothing about the nature of software or computation that would prevent the insertion of a backdoor into Qubes OS. It’s just a matter of adding the code.

So, this technical question is somewhat trivial and probably not what you care about. Instead, you probably care more about questions like these:

  • How likely is it that such a backdoor would be added? For example, how would it get inserted? Considering our code security review and code signing checks, which are extremely rigorous compared to those of the vast majority of other software projects, it seems unlikely that an outside contributor would be able to slip one in, and (as explained below) I think it’s also exceedingly unlikely that a core developer would try.
  • How likely is it that such a backdoor would go unnoticed? For example, if it gets noticed before it ever makes it into a release, then it never affects any users. This depends, among other things, on how many eyes are on the code (more on this below).
  • Epistemically possible, i.e., is it already happening, and we just don’t know it?

It depends on your epistemic position. Speaking only for myself, I would say the odds are very low, but of course nothing’s impossible. I have a semi-insider’s view (i.e., I’m privy to more of the internal operations than the general public, but there are still many things to which I’m not privy). From this point of view, I have seen zero indication that anyone on the team wants to or is being pressured to do anything like this. (Of course, it’s always possible that I’m the target of an elaborate deception and that I’m intentionally being fed disinformation or something, but I haven’t noticed any signs of anything like that so far.)

The fact that the code is completely open-source helps here, and it will help even more once ISOs are built reproducibly. This will allow people who scrutinize the source code to build an ISO from the source code they’ve scrutinized and confirm that it’s bitwise identical to the ISO distributed by the core devs. That means the rest of us who lack the skill to understand the source code can benefit from their scrutiny. On the other hand, none of this matters if no one outside of the project looks at the source code, which is why we all very much want a lot of eyes on the code. (As a user, I want this for the security benefits, and as a representative of the project, I want this to bolster the evidence of our trustworthiness.)
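The reproducible-build check described above boils down to comparing cryptographic hashes: if builds are bit-for-bit reproducible, an ISO you build from the audited source must hash identically to the officially distributed one. A minimal sketch (with stand-in files rather than real ISOs, since actually building Qubes takes hours):

```shell
set -e
workdir="$(mktemp -d)"

# Stand-ins: in reality, one file is your local build from audited source,
# the other is the ISO downloaded from the official release server.
printf 'bitwise-identical image contents' > "$workdir/local-build.iso"
printf 'bitwise-identical image contents' > "$workdir/official.iso"

# Hash both images; any single differing bit changes the SHA-256 digest.
local_hash="$(sha256sum "$workdir/local-build.iso" | cut -d' ' -f1)"
official_hash="$(sha256sum "$workdir/official.iso" | cut -d' ' -f1)"

if [ "$local_hash" = "$official_hash" ]; then
  echo "MATCH: the distributed ISO corresponds to the audited source"
else
  echo "MISMATCH: do not trust the ISO until the difference is explained"
fi
```

This is what lets non-experts piggyback on expert scrutiny: the auditors read the source, and everyone else only needs to compare two hashes.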

  • Likely, i.e., how high is the likelihood that the Qubes OS Project or the team behind it will do one or more of these things?

IMO, extremely low, for the reasons already given above, but again, not impossible. From what I know of the Qubes devs, none of them would ever want to do anything like this. They are nice, normal folks with good motives. However, we can’t rule out the possibility of exogenous forces like court orders, government agency operations, bribery, coercion, etc. I think most (if not all) of us on the team just want to lead quiet, peaceful lives while working on things we find important and interesting, and we sincerely hope the things I just listed stay far away from this project and the people involved in it. We have canaries, which provide some degree of assurance against certain kinds of exogenous interference into the project, but they’re not infallible. The more extreme the scenario you imagine, the less likely canaries or any other measure will be sufficient to guard against it. When thinking about these matters, I think it’s useful to consider several questions:

  • Many bad things could happen, but what would the motivation be for the people doing them? What would the costs be, and how would they justify those costs (including opportunity costs)? What risks would they be taking, and what would they stand to gain?
  • If you’re worried about some specific type of extreme negative scenario with respect to this project, how does it compare to other projects or companies? Could it also happen at other places? Is the likelihood higher or lower at the other places? Why?

I’d like to add that if you’re worried about extreme scenarios with regard to the Qubes OS code, you must also take into account hardware and below-ring-zero issues, like implants or Intel ME/AMD PSP. As noted in another discussion here, those provide much more bang for the buck and would involve far fewer co-conspirators, and there’s nothing the vast majority of people, experts included, can do about them.

Also, given the relatively high status Qubes seems to have in the infosec world, I believe a lot of people with infosec backgrounds use and scrutinize this piece of software. Finding a backdoor or a severe vulnerability in Qubes/Xen would be a noteworthy career achievement, IMO (I’m not in infosec), and would be newsworthy in the industry (especially an intentional backdoor). This assumes that there is a non-trivial number of auditors, and that many of them have more to gain by exposing their findings than by stashing or selling them.

Also note that code obfuscation is a thing: it is possible to write malicious code in a way that’s time-consuming and frustrating for humans to unpack, or to scatter the ‘pieces’ of the backdoor across disparate parts of the code, with each piece looking innocuous.


Not technical; consume with salt; corrections appreciated.


I also think that finding and presenting something like a full exploitation chain from the browser to Xen would be a huge achievement.

Come to think of it, a backdoor being found would probably spell the end of Qubes, permanently, so whoever wants a functioning backdoor actually has a large incentive to keep it extremely well hidden, lest the entire investment vanish. This is probably why out-and-out backdoors are so rarely found in OS-level software, AFAIK.

There are many ways to go about this; the most obvious would be to disguise it as a bunch of vulnerabilities introduced over a period of time by separate ‘contributors’, for plausible deniability. A skilled mastermind might be able to create ‘roadmaps’ to follow, so different paths can be charted through a constellation of vulnerabilities according to which ones have been discovered and which are yet to be introduced.

Of course, this is all conjecture (half-baked and haphazardly articulated, to boot). Regardless, I’m definitely not pleasing the tinfoil hat crowd with this.


Not technical; consume with salt; corrections appreciated.

Would it also have to be highly targeted? I’m not sure how popular Xen is in the cloud/VM world.

I was thinking about something like this: you have a default Qubes installation with Firefox in the default fedora-33 AppVM, you go to a website, and boom, your dom0 is owned.

As for the popularity of Xen, I don’t really know. I have a feeling it was much more popular years ago, and that KVM is now getting more prevalent. But I don’t have any numbers, just a feeling.

I remember reading that side-channel attacks like Spectre and Rowhammer can theoretically be exploited via JavaScript. Wouldn’t a Xen exploit chain that starts from the browser be straightforward to assemble as long as you have all the parts? The hardest part to acquire would be the Xen exploit, if what I hear is to be trusted.