Julian Assange: Debian Is Owned By The NSA
This looks like FUD to me, and something of a smear campaign against free/libre software. Maybe people try to introduce security vulnerabilities into Debian, but there's no reason to believe that the Debian developers are involved in some sort of big conspiracy to covertly slip security vulnerabilities into Debian. And I'm sure the NSA is more interested in the more reliable route of adding security vulnerabilities and backdoors through proprietary software. Why spend more effort, with a lower success rate, trying to weaken Debian, when they can do it to Windows so much more easily, especially considering that Windows is still the most popular OS?
@opon4: I suggest you watch this video from the BSD guy at FOSDEM:
http://video.fosdem.org/2014/Janson/Sunday/NSA_operation_ORCHESTRA_Annual_Status_Report.webm
Then think about the more recent OpenSSL bugs....
Debian might not be "owned" in the sense that it belongs to the NSA, but there are a lot of possible backdoors in the most vulnerable parts of the GNU/Linux ecosystem (e.g. badly written code), and there is not much we can do about it.
For closed-source operating systems this is of course worse, but GNU/Linux is far from being as safe as a lot of people might think.
Thanks for the link. I never knew Poul-Henning Kamp existed. But he has a clear mind, and admirable focus. I mean, the questions were asked by complete morons. No wonder OpenSSL is a mess; a painter would be more qualified to write C code than a CS major whose ethics and understanding are at the level of Marvel comics.
Sometimes I feel it is degrading to have IT guys and theologians qualify for a university degree when they are less qualified than a stonemason.
Haven't watched the video yet, so I have no idea what Assange is saying, but the wordpress.com guy is a typical basket case. Conspiracy theory all the way - almost entirely noise.
Do the big corporate entities, in control of the US government, have an interest in infiltrating and subverting every Free Software-related organization? (Most definitely, they do.)
Does the US government have a habit of infiltrating and sabotaging organizations opposed to the corporate interests that it serves? (Yes, it does.)
Has the US government infiltrated any main/big Free Software-related organization, with the above-mentioned purposes? (I don't know... But, I most definitely wouldn't be surprised to learn that... And, the known facts, revealed in that article, point to that suspicion...)
For these reasons, I'm not surprised to read (and listen to) this, concerning Debian...
And, I already didn't trust any corporate GNU/Linux distributions. (Including: "Red Hat", that has deep ties with the US government; and "Ubuntu", that even uses the same design as the British equivalent of the NSA.)
And, since I don't personally know any of the people responsible for other smaller distros, that have "benevolent dictators for life" at their head, I don't know if I can trust them, or not.
For these reasons, only the distributions made by non-profit organizations, that have a clear and transparent pro-Free Software policy, are the ones that I put my confidence in. Although, it's never a /total/ confidence. Since, I know (from experience) how every activist/progressive organization can be infiltrated and subverted, to different extents (and, how even some of the "alternative" organizations are, in reality, traps, built by the system itself, in order to catch the more naïve).
Thank you for sharing that most interesting piece of news, bitbit.
"And, since I don't personally know any of the people responsible for other smaller distros, that have "benevolent dictators for life", in front of them, I don't know if I can trust them, or not."
You mean like trisquel?
"For these reasons, only the distributions made by non-profit organizations, that have a clear and transparent pro-Free Software policy, are the ones that I put my confidence in. "
You mean like debian?
"You mean like debian?"
No - They said one with a "pro-Free Software policy." That precludes the Debian Project (note I'm referring to the Project, not the distro). The Project's policies are at best neutral, seeing no problem with free and non-free stuff sitting side by side.
Actually, I do include Debian among those non-profit organizations with a pro-Free Software policy...
Since, I can understand the (practical) need for a proprietary repository, for those people who really need to use non-free programs. And, I think it's better to have one such repository that is maintained by a pro-Free Software organization, than a corporate one.
(One organization doesn't have to be "pure", in that aspect, to be included in my such label/description...)
Although I, obviously, prefer the non-profit organizations that distribute /only/ Free Software - and consider them to be the "champions" of pro-Free Software policies - I do believe in the sincere effort of the people at Debian, in trying to make people come to the Free Software side, by serving as a "bridge" for those who, for various reasons, can't afford to use only Free Software.
(As far as I know, Debian appeared at a time when it was not even possible - or almost - to have a home computer working decently, without the use of proprietary drivers and/or programs... And, it was by starting to use Debian-derived distros that I eventually got to Trisquel.)
The people at Debian did, eventually - when they could(?) - rid the Linux kernel of proprietary blobs, and they continue to make a very clear distinction between free and proprietary software, by forcing people to activate a separate repository if they really want to use proprietary programs (and, therefore, forcing them to be completely aware of such a decision and distinction).
It was even on the Debian project's web pages that I learned what Free Software was all about, and got to read the GNU GPL.
As I said in here, previously, (https://trisquel.info/en/forum/how-many-people-are-working-trisquel-which-libre-linux-distro-has-most-developers#comment-42517) I believe their decision to have a proprietary repository to be one that is forced by practical aspects (even though it goes against their intentions) and, not one made out of indifference, for the whole idea of Free Software.
When I mentioned "smaller" distros, I meant the *small* ones, that are maintained by half-a-dozen people, or so - who reveal very little about themselves.
And, what I meant, overall, was that:
- I don't trust such "small" distributions.
- And, concerning the "big" ones, I don't trust the ones that have corporations behind them.
That leaves only the "big" ones that are made/maintained by well-known, serious non-profit organizations as the ones that I'm willing to try.
(Trisquel, with its large number of volunteers, involved in it, is "big" enough, to fit my description.)
This doesn't make any sense at all;
trisquel has exactly one developer.
It's the prime example of a small distro.
I talked about the number of people who "maintain" a distribution...
And, there are a lot of people involved in Trisquel, besides the person responsible for the software itself.
There are many libraries and programs which also have only one person responsible for them, at their origin. And, I could never know every one of those same persons...
The trust is never absolute.
But, with a large number of (clearly) well-intended people involved in a distribution, that trust can be much higher than in other situations.
"I talked about the number of people who "maintain" a distribution...
And, there are a lot of people involved in Trisquel, besides the person responsible for the software itself."
So what exactly is the benefit of a bunch of people submitting bugs and giving support in a forum?
If you increase the number of those "maintainers", the amount of trust you have to put in the main developer remains the same.
I think you're trying to rationalize why trisquel should not count as a small distro;
most of the common "small distros" have many people involved, just like trisquel.
Maybe you mean something different;
Knowing Rubén personally, I trust him. Granted you don't know me so take this for what it's worth.
Even if you trust Ruben completely, Trisquel does not rebuild many of the packages from Ubuntu.
For example, apt-cache show openssl gives a checksum for the latest deb of 843af273766056f25469e9915ddd2567. Compare that with what's here.
So you have to trust upstream as well.
Trisquel recompiles (aka rebuilds) all the packages from source on Trisquel servers. You can tell this because all the packages are signed with just the Trisquel key and 'apt-key list' only has the Trisquel key showing.
Yes Trisquel has to trust that upstream hasn't put anything in the publicly available source. But that also means it's checkable, which anyone so concerned is welcome to do.
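For what it's worth, anyone can check the signing side of this themselves. A rough sketch (the archive URL and release name are only examples taken from this thread; adjust them for your own install):

apt-key list                                               # the archive keys apt currently trusts
wget http://archive.trisquel.info/trisquel/dists/toutatis/Release
wget http://archive.trisquel.info/trisquel/dists/toutatis/Release.gpg
gpgv --keyring /etc/apt/trusted.gpg Release.gpg Release    # verify the repo index against those keys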
No they don't. Look at the checksums.
It looks like some files match like http://packages.ubuntu.com/precise-updates/amd64/python3/download and http://packages.trisquel.info/toutatis-updates/amd64/python3/download
but then http://packages.ubuntu.com/precise-updates/amd64/python3.2/download and http://packages.trisquel.info/toutatis-updates/amd64/python3.2/download do not
Run "apt-cache show python3.2"
Result: same checksums. The packages.trisquel.info site is messed up, I reported an issue a while ago. (Some of trisquel's own packages don't even show up!)
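If anyone else wants to repeat that comparison locally instead of relying on the (broken) package web pages, something along these lines should work; the package name is just the one discussed above:

apt-cache show python3.2 | grep -E '^(Version|MD5sum|SHA256)'   # what the Trisquel index claims
apt-get download python3.2                                      # fetch the actual .deb
sha256sum python3.2_*.deb                                       # hash it yourself
# Then compare against the checksum Ubuntu publishes for the same version
# on packages.ubuntu.com (or in Ubuntu's own Packages index).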
I don't know if what you're asserting is true, but if I may, I'd like to point out that checksums show the contents of a file, not who made the file. If Canonical and the Trisquel team use the exact same method to compile a program (not incredibly unlikely), the checksums will be the same for both.
The openssl package contains a build date/time. (I don't know if all deb packages do, but this one definitely does). So it should differ if Trisquel rebuilds it.
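If you want to see what date information a given .deb actually carries, a quick and rough way to look (file names here are illustrative):

apt-get download openssl
dpkg-deb --info openssl_*.deb       # control metadata: version, maintainer, etc.
dpkg-deb --contents openssl_*.deb   # file listing, including the mtimes set at build time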
After reading some more of The 'Fine' Manuals, including Debian's docs and then this:
https://trisquel.info/en/wiki/how-trisquel-made
I stand corrected: many binary packages are copied verbatim - so thank you.
Of course, the checksums being the same is, of itself, not enough to go on. A fully self-compiling, one-person distro could conceivably write a script to normalize the file timestamps etc. in its own version of the .deb, so that a comparison against the original can be done and a lot of compilation-log surfing avoided. Which would seem to be what's needed here.
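A minimal sketch of that kind of timestamp-insensitive comparison (the .deb file names below are made up):

dpkg-deb -R trisquel-openssl_1.0.1-4_amd64.deb ours/       # unpack the file tree plus control files
dpkg-deb -R ubuntu-openssl_1.0.1-4_amd64.deb   upstream/
diff -r ours/ upstream/   # diff -r compares file contents, so the archives' mtimes don't matter;
                          # only content differences (including any embedded build dates) get reported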
"So you have to trust upstream as well."
No you don't. Source code is still available and the package can be modified and recompiled if necessary.
Do you rebuild all source packages? Do you trust Ubuntu's binary packages?
Try looking at the checksums of some binary packages that Trisquel hasn't modified. Compare with Ubuntu.
I am merely saying that, with source code availability, Trisquel has the ability to do so.
RMS has mentioned this before in some of his talks. Free software is not a perfect solution from a privacy standpoint, since the entire GNU/Linux system is so complex that it still requires eyes on all packages all the time, which is hard. It requires some level of trust somewhere in compiled packages, but it sure beats non-free software. It really is the best anyone can do.
At the end of the day you could compile the entire system yourself, but it's still impossible for you alone to know what is in the public source, since you're just one person. Therefore we have to trust that somebody, somewhere, in some project, if they find a bug or a security hole, etc., will tell someone and fix it, so that you can then know. Again, it's not perfect, but what else can you do?
:)
I don't necessarily have to know someone "personally", in order to trust him, or her. I just have to know him, or her, well enough - as I would, if I had the opportunity to relate with such a person. (1)
And, alternatively, one can reveal (very much) about one's true nature, just by the nature of one's work. (2)
Meaning that...
(1) People like Richard Stallman, for example, reveal a lot about themselves, in the public talks they give - which are a very good way to know what kind of persons they are.
And, (2) one very good way to judge a person, is by the nature of the work s/he does - through which we can infer that person's (true) intentions. (Like in the saying, "judge a tree by its fruits".)
In late 2010 OpenBSD experienced something similar.
It is interesting to look at the details of that case. Awful lot of connections but apparently nothing big came of it or so they all say.
http://marc.info/?l=openbsd-tech&m=129236621626462&w=2
A few days later...
http://lwn.net/Articles/420858/
I'm not sure what I believe, but I can say that this type of thing introduces distrust which takes more time and energy to overcome or to verify.
Couldn't help noticing the irony. So we believe in a conspiracy. And everybody is out to get the poor user. And Google is evil. And the NSA protects Google, or the other way around. So we make a video. And we're going to distribute it through a Google service, a service that watermarks the files and uses closed formats. With such followers, Assange is a dead man living on stolen time.
Google is a front for the CIA/NSA. (http://www.infowars.com/group-calls-for-hearings-into-googles-ties-to-cia-and-nsa/)
And, it repeatedly censors people who use its services - including myself. (http://blackfernando.blogspot.pt/2013/03/como-o-youtube-censura-os-meus.html)
The reason why many people, who are aware of this, still use its services, is because they're still the services used by most people who are not aware of all this. (And so, if you post a video on YouTube, it will surely be seen by many more people, than if it was posted elsewhere.)
But, yes... This is all proving to be counterproductive, to the powers-that-be. Which is the reason why the Internet, itself (at least, as we know it), won't last for long... (https://trisquel.info/en/forum/internet-censorship-authoritarian-countries#comment-30744)
Infowars is not a reputable source :/
You're not a reputable source. :/
I'm not wasting time on this so here http://conspiracies.skepticproject.com/articles/alex-jones/
And, I'm not even going to waste my time debunking you, so here:
If Debian is compromised by the NSA (I highly doubt that this is true), it can be fixed;
developers all over the world can check the code over and over again, and with a huge effort we can lock the NSA out again;
that's the difference from non-free software;
so this is no argument against free software;
@quantumgravity: I have no doubt that Debian is "compromised" in pretty much the same way as every GNU/Linux distribution is compromised. Due to the sheer amount of code in the kernel itself, in all the applications used, and in all the libs included, it is close to impossible to validate the code for deliberately implemented weaknesses / backdoors.
Also, let's not forget that areas with high complexity are not really understandable to most devs. This applies to critical areas like the OpenSSL lib and possibly a lot more.
Of course this is no argument against free software. No-one said so.
This shouldn't even be a thread. It is just a blog post that puts words into Julian's mouth.
Here's the link for the video.
https://www.youtube.com/watch?v=UFFTYRWB0Tk&t=20m
He just uses a Debian bug as an example to illustrate a point about backdoors that are disguised as bugs.
Should we be checking Trisquel? Yes, but jumping to conclusions is illogical. Sort of like this (from the blog):
"Assange mentions how Debian famously botched the SSH random number generator for years (which was clearly sabotaged – a known fact"
He provides no sources. There is no way to tell if a program was purposely sabotaged, but with auditing and open source we can check and fix those problems.
Please, no more giving these nuts, who use their own blogs as a soapbox, a platform on this forum. It's just FUD.
Darksoul71: "Due to the sheer amount of code for the Kernel itself, all applications used and all the libs included it is close to impossible to validate code for possible implemented weaknesses / back doors."
Yes. It seems many eyes saw the Debian bug, but that was not the end of it, it still became a massive mess in spite of the "awareness".
http://marc.info/?l=openssl-dev&m=114651085826293&w=2
Mr. Roeckx was known. http://pgp.mit.edu:11371/pks/lookup?op=vindex&search=0x41DC1C907244970B and later http://pgp.mit.edu:11371/pks/lookup?op=vindex&search=0x2064C53641C25E5D
There is the idea that transparency or openness gives the code an automatic kind of purity or security, so much better than proprietary software, and that may be true... or should be true. However, when errors are missed or minimized because of the assumption that so many eyes see the code that "someone else" will look into it, that is a problem. If too many people think along those lines, "nobody" is looking at the code, you know.
To all:
The intellectual discourse on this issue has certainly been interesting and shady, with lots of negative political drama and black ops worth reading about.
But, what on earth do we do from here?
What are the solutions that don't compromise and affect even more deeply all the GNU/Linux OS source code, and furthermore, WHO DO WE TRUST?
It's really just a matter of transparency and organizing things to optimize that. You cannot have perfection.
jodiendo: "But, what on earth do we do from here?
What are the solutions without compromising and affecting deeper all the GNU/LINUX OSI source code, farther WHO DO WE TRUST?"
As an illustration, imagine a shantytown built by "good" people, not designed by architects and not built by licensed contractors. No blueprints, no building codes, no inspectors. The shelters "work" but are not ideal.
Not saying that existing code is a shantytown, however didn't most of it "grow" into being rather than being planned and designed as a "whole" GNU/Linux system with exacting coding standards and oversight along the way?
An intern is saying he caused the heartbleed exploit. http://www.forbes.com/sites/kashmirhill/2014/04/10/whats-really-scary-about-heartbleed/ Look at the effect of that goof. Given the project budget and manpower, they accomplished a lot, sadly including showcasing the fragility of web security.
When you see software or patches uploaded by "cloudchild" or "starlord" or whatever, should we be comfortable with that?
A corner has been turned. It's going to be harder now, and perhaps less fun, and less innocent.
Maybe it is time for a formal community of code auditors and reviewers to be created. Piece by piece, step by step, every line checked, impossible as it seems.
Thank you for that accurate explanation.
Like you said, "A corner has been turned. It's going to be harder now, and perhaps less fun, and less innocent."
We must be more diligent and savvy, but I do ask these questions: HOW? When? What? Where do we start identifying the trustworthy?
This issue has become a Hydra and, especially, "this old can of worms" is something we don't want to be transmitting securely with.
Thank You again.
jodiendo,
I read somewhere (who knows if it is true) that one US government agency had ~1,000 full time analysts who look for bad code to be used for potential exploits. Assuming other countries have something similar, in numbers alone, the free (and open) software movements would need similar thousands--full time equivalent--doing similar work and making fixes, rather than cataloging and saving exploits for later.
This approach does not involve "trust", which is what you asked about, but may well work better than trust. Just use plain old "overwhelm" and make the opposition's work more and more expensive and more and more difficult to justify and sustain.
The problem with trust is that while it ought to be "earned", the world is full of good guys and bad guys and corruptions within both groups who do not make the accounting easy or fair.
Ideally "they" want to dispense trust or safety or privacy; to bless it, to control it, which in turn gives them an indirect control and status over all people who "need" to believe in those issues. And many people are ok with, and even welcome, a scenario of "letting someone else do it", so that kind of trust system works, but doesn't really deliver what it promises.
It takes "work" to accept and deal with trust issues, poor security, and the loss of privacy. Making the work harder is that there is no end point; the work is never finished, it is an ongoing process as they say...because good guys and bad guys keep figuring out new methods to accommodate their ends and needs.
Personally I am still waking up to this reality and have difficulty accepting it. I have noticed it however and that is a start.
There has been a compelling myth that freely readable software code has many eyes constantly checking and improving it. That myth must die today and no longer be recognized as "real". From now on, it needs to become real.
Developing a real system of code checking that overwhelms the opposition is what would be helpful. How to arrange that? At this time, I do not know. There must be a way, however.
Well, the article the OP linked is very questionable, or at the very least misleading, with regard to what Julian Assange actually said. Go here for what he actually said: https://www.youtube.com/watch?v=UFFTYRWB0Tk#t=1222
And his statement is essentially the same as FSF's: https://www.fsf.org/news/free-software-foundation-statement-on-heartbleed-vulnerability
The author of the article (IgnorantGuru) is free to have their opinion on how corrupted GNU/Linux developers are, but Julian Assange never said nor even implied that Debian is either owned by or in a conspiracy with the NSA. It is solely the author's conclusion.
As to what Julian Assange said, he mentioned that Debian's package system design (where things depend on each other) is somehow an insecure design because all it takes is compromising one or a few packages. Does he know what he is talking about, or is there an apparently more secure design approach to operating systems? One thing I would like to see (which should help) is a deterministic build system as mentioned at LibrePlanet 2014.
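To make the "deterministic build" idea concrete: if builds were reproducible, anyone could verify a shipped binary roughly like this (a sketch only; "hello" is a stand-in package name, and today's builds usually embed timestamps and paths that break the comparison):

apt-get source hello            # the published source package (needs deb-src lines in sources.list)
cd hello-*/
dpkg-buildpackage -us -uc -b    # rebuild the binary package locally, unsigned
sha256sum ../hello_*.deb        # hash of our own rebuild
apt-get download hello          # the binary the distro actually ships
sha256sum hello_*.deb
# With deterministic builds the two hashes would match, proving the shipped
# binary really corresponds to the published source.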
Also: author seems to think Red Hat rules the GNU/Linux universe. What?
FWIW Debian had already backported security fixes for the vulnerable versions of OpenSSL before heartbleed became public. When it comes to security, Debian is the last OS you should really worry about. :/
I'm not sure why it took this long to sound the alarm to everyone else, and I'm not sure why system admins would fail to keep such a crucial bit of software updated (if you had kept it updated, it would have been fixed). But if you run Debian or even a Debian-like system, you can benefit from instant security fixes with unattended-upgrades.
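For the record, a minimal sketch of turning that on for a Debian-like system (the file paths and settings below are the usual defaults; check your own distro's docs):

sudo apt-get install unattended-upgrades
sudo tee /etc/apt/apt.conf.d/20auto-upgrades <<'EOF'
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
EOF
# Which archives get auto-upgraded (security updates by default) is configured
# in /etc/apt/apt.conf.d/50unattended-upgrades.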
You can complain about the monolithic Linux kernel though, whose design is not ideal for security. Microkernels are better, and this is why I look forward to the HURD kernel (a collection of servers running on the MACH microkernel). https://www.gnu.org/software/hurd/
laigualdad at riseup.net wrote:
> As to what Julian Assange said, he mentioned that Debian's package system
> design (where things depend on each other) is somehow an insecure design
> because all it takes is compromising one or a few packages. Does he know what
> he is talking about, or is there an apparently more secure design approach to
> operating systems?
Let's review what he said. I watched the video and typed in this
transcript from what I heard him say.
===========================================
Questioner: Now, one of the big topics here is open source and I'm
wondering whether the fact that you have an open system where everyone
knows how it works would make encryption more secure than a closed system?
Assange: We know from experience that it does seem to be the case; that
there's a vast number of closed source snake oil encryption systems
being spread around. Now we know that open source is not entirely a
solution. For example, there was an encryption bug in Debian's version
of SSH in the random number generator which existed for years and that
was all open source. Now it was eventually found and revealed also
because it is open source. But the way things are done now is bug doors:
these are back doors designed to look like bugs. And what is the
security of the programmers who are involved in some of these open
source systems? Can you, when they update their code, can you implant
what looks like a bug, even a typo that carries through? Or, say, look
at a system like Debian, the various kinds of Unix systems. Look at all
the packages they include. Look at the upstream binaries -- dependencies
upon dependencies upon dependencies. All you need to do is compromise
one of these dependencies and then there's a flow through and these all
get embedded. I mean, these modern systems now, are assemblages of
incredible intellectual content which is being developed all over the
world over the past 10 years by many different players. It is the nature
of our CPUs that there is only a few, you know, maybe 3, different
security layers in our systems. But when you pull together thousands of
packages all together it's pretty hard to resist the security
compromises that are engineered by nation-states. It doesn't mean that
it's not worth trying and increasing the cost of owning the world.
===========================================
I don't immediately see how interdependency is inherently insecure nor
do I see how interdependency is avoidable. I would like to know more
about a design that avoids interdependency. I figure this is basically
impossible because every OS I know of runs programs atop system
libraries. So if there's a vulnerability in a system library, every
application inherits that vulnerability unless it takes steps to work
around the issue. In the Free Software world I doubt developers do this
because developers can patch the system code and use the system code as
intended.
The only approach I can see to solving this issue is the hard work
developers and distributors should be doing anyhow: greatly reduce the
number of packages in the distribution to those packages one can vet,
and then keep up with vetting source code and updates for those
packages. This is certainly work worth paying people to do (in other
words, a commercial opportunity unique to Free Software). Any OS
distribution aiming to do this would be wise to start with a 100% Free
Software system like Trisquel. By the way, I do not mean to say "open
source" or "FLOSS" here instead of "Free Software". The open source
movement is ready to accept non-free software out of convenience and
adherence to its developmental methodology which was designed to ignore
software freedom. Such goals directly contradict the purpose of the work
I just described.
> Also: author seems to think Red Hat rules the GNU/Linux universe. What?
The blog author doesn't defend some of the points made in the article on
http://igurublog.wordpress.com/2014/04/08/julian-assange-debian-is-owned-by-the-nsa/
therefore I can't take that article too seriously.
A lot of that article is guilt by association -- Red Hat has US
government contracts, therefore Red Hat is a suspect in getting
insecurities into software they distribute (software that could be
distributed further by others, such as the Linux kernel). There's little
point in distinguishing between proprietors and the US Government, but
that doesn't make the Heartbleed bug an NSA plant nor does it mean
"Finland outed the NSA here". There's no clear evidence of Heartbleed
being anything but a mistake.
I don't buy guilt-by-association reasoning in this context precisely
because of the freedoms of Free Software -- so long as people have these
freedoms we have the tools we need to look out for our interests if
we're willing to apply the rigorous inspection and questioning that we
also require. Eliminating non-free software is a major step down that
road (thanks to Linux-libre and every OS distribution that distributes a
100% free system!).
If Red Hat were contracting for a non-free software distributor (such as
Apple, Microsoft, or Google) I'd certainly wonder about Red Hat's
involvement. Users of proprietary software aren't allowed to inspect,
alter, or distribute derivative programs. Those users have plenty of
reason to believe they're running malware
(https://gnu.org/philosophy/proprietary.html) and thus ample reason to
reject that software outright if they care about ethics, insecurity, or
software freedom.
Another example of an unexplained assertion in that blog post is "So it
comes as no surprise to me that they jumped on board systemd when told
to, despite the mock choice publicized to users – there was never any
option.". I don't understand how picking systemd over some other
mechanism to do the same job has a connection to letting in sabotaged
code leading to security exploits.
Just to clarify for those who skim through the thread without clicking on the links, the link above by alimiracle (https://twitter.com/wikileaks/status/454246967124963328) is Wikileaks denying the claim in the OP's original article, which was spread on the usual conspiracy channels but has no basis in reality.
"None of our people said this. Mr. Assange spoke about vulnerability of OS's to bribes and bugdoors in upstream components."
Exactly
Not true.
The people at WikiLeaks are denying that it was one of them who stated what is said in the title of that article - and, they're not necessarily refuting that author's /interpretation/ of the facts (which, as I stated below - https://trisquel.info/en/forum/julian-assange-debian-owned-nsa#comment-51971 - I consider to be a valid one, when the use of the term "owned" is clarified).
While the OP's claim against Debian isn't entirely valid (Julian never made that claim about Debian in the video interview), much of the information is based on truth. We do know that many systems are intentionally infected - see "To Protect and Infect" [30C3], and also the trail of strange Debian bugs which leave you extra vulnerable. https://ftp.ccc.de/congress/30C3/webm/
As Grsec states wisely concerning the GNU/Linux kernel development -
"The “many eyes” of open source are blind, uninterested, or selling to governments for profit (it’s not the 1992 AD scene anymore)" - The Case for Grsecruity https://grsecurity.net/the_case_for_grsecurity.pdf
While these systems are open, not enough people are pentesting them for vulnerabilities.
Open source provides the feature that everybody can look into the source code. That's great. But as far as I can see, only very few people have the time, the skill, and the will to understand and audit critical components of GNU/Linux.
G4JC, it's good that you mentioned Grsecurity and provided the link. Grsecurity is one great step towards a more secure operating system, as insanitybit.com said:
"The PaX team and Spender are consistently providing mitigation techniques that work to remove entire classes of vulnerabilities. Grsecurity/PaX has basically been ten steps ahead of every software security implementation, so watch that project if you want to know what defense is going to look like in a few years."
Many people are talking about security. But if you search for Grsecurity in this forum, you will find only two posts. That's the reality.