snopes.com > Urban Legends > Computers
#21 - 03 December 2007, 01:31 PM - Insensible Crier (Pittsburgh, PA)

Quote:
Originally Posted by Troberg View Post
Ariane 5 was one rocket; Vista and IE are installed on millions of computers, many of them running critical systems, and have serious security and stability flaws. Which has the greater potential for large-scale destruction?
You're being a bit alarmist here. If Vista has an error similar to Ariane 5, at worst a program crashes and you just restart the thing. No big deal. Even if the whole computer dies a horrible death, that's what backup and DR sites are for. Despite what you see in movies like Live Free or Die Hard, most computer systems are not fragile little networks where one logical error causes a huge catastrophe.

Plus, critical systems are usually on dedicated servers running server software, not Vista, and are not being used to browse web pages with IE. Plus there's a hell of a lot more to system security than what operating system and software is installed.

There is nothing to support the idea that Vista and IE are going to cause some massive outage and disaster, so there's no reason to be Chicken Little screaming that something akin to Y2K is about to happen because of Microsoft.
#22 - 03 December 2007, 02:18 PM - Troberg (Borlänge, Sweden)

Quote:
You're being a bit alarmist here. If Vista has an error similar to Ariane 5, at worst a program crashes and you just restart the thing. No big deal. Even if the whole computer dies a horrible death, that's what backup and DR sites are for. Despite what you see in movies like Live Free or Die Hard, most computer systems are not fragile little networks where one logical error causes a huge catastrophe.
All it takes is one good exploit, and a large chunk of the Windows machines may be down, possibly with wiped disks and wiped BIOS. Just look at what the relatively benign viruses that struck IIS five years or so ago caused. My server (not running IIS) logged over half a million attacks from those viruses in two months. That probably meant that just about every IIS that wasn't properly patched was infected.

Quote:
Plus critical systems are usually on dedicated servers running server software, not Vista and are not being used to browse web pages with IE. Plus there's a hell of a lot more to system security than what operating system and software is installed.
True, but sometimes a lot of damage can be done just by wiping the end-user machines, even if the database survives. Sure, the world will not end, but how much will it cost if, say, 50% of the Windows machines need to be reinstalled?

Security is a lot more than OS and software, but they are two of the key components.

Quote:
There is nothing to support the idea that Vista and IE are going to cause some massive outage and disaster, so there's no reason to be Chicken Little screaming that something akin to Y2K is about to happen because of Microsoft.
Well, since Y2K never became a problem, it's not a good example.

The basic problem is that MS cannot make a safe product. I wrote a long post on this a while ago, and I have no time to write it again. If you'd like to babelfish it, I have it in Swedish here: http://rpglab.net/troberg/pmwiki.php/Main/Microsoft
#23 - 03 December 2007, 03:22 PM - BluesScale (Woolhampton, Berkshire, UK)

"All it takes is one good exploit, and a large chunk of the Windows machines may be down, possibly with wiped disks and wiped BIOS. Just look at what the relatively benign viruses that struck IIS five years or so ago caused. My server (not running IIS) logged over half a million attacks from those viruses in two months. That probably meant that just about every IIS that wasn't properly patched was infected."


Hmmm. Well, no. I was involved in the cleanup for that one. A lot of machines were infected, it is true, but firewalls protected some, and quite a few were unaffected because they were intranet servers; many ISVs also used packet filtering to protect systems. Of course, many systems were properly patched. Well-patched systems have rarely been hit by worms, well, not since Code Red anyhow. The patch for Nimda preceded the malware by 331 days. The patch for Slammer preceded it by 180 days; for Nachi, 151 days; for Blaster, only 25. Sasser arrived 17 days after its patch, Zotob 9 days. Also, the trend is toward less destructive malware, since botnets are where the money is.


In fairness, unpatched Linux is pretty easy to break as well.

"True, but sometimes a lot of damage can be done just by wiping the end-user machines, even if the database survives. Sure, the world will not end, but how much will it cost if, say, 50% of the Windows machines need to be reinstalled?"

Quite a bit. Even Code Red didn't hit that high a proportion of machines, though, and defense in depth is hugely better now, of course. However, I don't see this as an argument for one OS over another. Imagine that Linux was on 90% of systems and a Linux-based malware did the same. It would be just as bad. Linux systems are just as vulnerable as any other.

"The basic problem is that MS can not make a safe product."

Hmmm. Well, yes. Nor can any company. Perfect safety is impossible in a connected world. All that can be done is risk reduction. I can tell you that (at present) 77% of malware infections are due to users installing malware via social engineering. 23% are due to security flaws, many of which are patched if you install the updates. Linux does not have a better profile for security vulnerabilities and has a worse record for regressions caused by kernel fixes.

As for Vista being a disaster because it is so vulnerable... well, no. The facts don't bear that out at all. UAC (annoying though it is) blocks a lot of exploits. Restricted mode in IE and WinMail block a lot more. Random address relocation in certain modules breaks most exploit mechanisms even if they use heap spraying. The new parsers in Office 2007 stop whole classes of exploit... and it shows. There are very few Vista systems which are compromised and I have only seen one that was not compromised due to social engineering.
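The address-randomisation point is easy to see with a toy model. This is a sketch of the principle only, not how Windows implements ASLR; the slot count, addresses, and "exploit" are all made up for illustration:

```python
import random

PAGE = 0x1000
SLOTS = 256  # pretend the loader can pick one of 256 page-aligned bases

def load_module(randomize, rng):
    """Return the base address the loader picks for a module."""
    if not randomize:
        return 0x40000000                      # fixed, predictable base
    return 0x40000000 + rng.randrange(SLOTS) * PAGE

def exploit_succeeds(base):
    """The attacker's jump target is hard-coded for the fixed base."""
    return base == 0x40000000

rng = random.Random(1)
fixed_hits = sum(exploit_succeeds(load_module(False, rng)) for _ in range(1000))
aslr_hits = sum(exploit_succeeds(load_module(True, rng)) for _ in range(1000))

# Without randomisation every attempt lands; with it, only a rare lucky guess.
print(fixed_hits, aslr_hits)
```

In this model, randomisation turns a reliable exploit into roughly a 1-in-256 gamble per attempt, and in practice a failed guess usually just crashes the target process, which is noisy and easy to spot.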

I have the utmost respect for your opinions, Troberg, but I am afraid that I cannot agree with your data or your conclusions - and analysis of this sort of thing is what I do for a living.

Blues
#24 - 03 December 2007, 04:15 PM - Troberg (Borlänge, Sweden)

Quote:
In fairness, unpatched Linux is pretty easy to break as well.
It depends a lot on which Linux you are talking about; most of them are pretty tough.

Quote:
Quite a bit. Even Code Red didn't hit that high a proportion of machines though and defense in depth is hugely better now, of course.
I would say that it hit almost 100% of the unprotected servers, and they are a large part of the internet. Many put up servers, get them running and then don't touch anything as long as they keep running. Dangerous, but that's the way a lot of people think.

Quote:
Imagine that Linux was on 90% of systems and a Linux based malware did the same. It would be just as bad. Linux systems are just as vulnerable as any other.
Wrong. Being open source means that most vulnerabilities are found before they are even put into the code base of a release version. Security can be verified, something that can never truly be done on a closed source system.

Quote:
Hmmm. Well, yes. Nor can any company. Perfect safety is impossible in a connected world. All that can be done is risk reduction.
That's not what I was talking about. I'm talking about MS being unable to make safety a reasonable priority.

I had hoped I wouldn't have to translate my reasoning from the linked page, but here goes in condensed form (someone with stronger search fu can find the full version somewhere on this board):

* Lack of competence. Sure, the competence exists within the company, but with the recent hiring sprees it has become diluted.

* Dangerous focus. MS has always chosen user friendliness before safety, and many of the most stupid and dangerous exploits stem from this doctrine.

* Trust in retroactive security. If MS built houses, they would not have any locks or alarms until someone actually broke into the house, and even so, only that specific entrance would be locked. Security must be proactive and built into the product from the start.

* Building on a bad base. Parts of Windows are 20 years old, dating back to when networking was part of DOS. Sure, they have been recompiled for 32-bit and 64-bit, but the basic design and thinking still carry a 20-year-old heritage. A full dump-and-rewrite is overdue, and has been for a while.

* Trust in extra products for safety. If the OS and the products used are safe, a firewall is not needed. A firewall is like saying "I didn't bother to lock the door, but I built a fence around the house instead". It's even a false sense of security, as most attacks are indistinguishable from normal traffic to the firewall. If the fence allows the postman to enter with presents, it will also allow mail bombs to enter.

* Drivers run outside kernel control, which means that they are outside OS control. This allows nasties like rootkits, and allows a broken driver to kill the entire OS. In most other OSes, drivers run like any other program, firmly under OS control.

* Various measures are implemented, or will be, that remove the user's ability to control what's running. Earlier under the Palladium code name, now Next Generation Secure Computing ( http://en.wikipedia.org/wiki/Next_Ge...Computing_Base ). It's important that the user can check everything that runs and can allow or disallow anything from running. Sure, most users don't care, but enough do to make sure security problems are detected. Look at how Mark Russinovich blew the lid off Sony's rootkit a while ago; that would not have been possible with these restrictions.

* MS Update / Windows Update. This is a big eff-up from beginning to end. They don't tell you clearly what the updates will do, it has security problems (twice, I've received viruses that way; luckily my virus scanner killed them), and it installs stuff I don't need (why would I need Office updates or media players on Windows Server?). It's injecting foreign, unverifiable code into my system.

* 9 out of 10 MS employees eat small children, according to estimates made by Linux experts. This gives them very little time to think about security issues.

* MS Update again. A paradox is that it is more or less needed to fix everything that's not safe from the start, even though it's a black box, and the constant need for restarts makes it unusable on many servers whose uptime requirements don't allow for constant reboots. A server needs to be stable; mystery updates are too big a risk.

* Windows is closed source. This means that security cannot be verified. No independent experts can check the source, and we are left in a situation where we must trust the word of MS. It also means that security issues are not found until they are exploited, unlike open source, where there are hordes of developers who get a sexual kick out of pointing out bugs in other people's code. It's like the reason why you should use public encryption algorithms: you know the sharpest brains have done their best to crack them and have failed. A "secret" algorithm is untested and you can't trust it.

* MS has "educated" their users to believe that a constant flood of patches means security, while the real story is the opposite. The need for patches comes from an unsafe system. Now people think that an OS that does not have five patches a day is unsafe. Silly, but they believe it.
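The public-algorithm point a few bullets up can be illustrated in a couple of lines. This sketch uses only Python's standard library; the file contents and the "published" digest are invented for the example:

```python
import hashlib
import hmac

# SHA-256 is fully published and has survived years of open analysis, yet
# knowing the algorithm does not help an attacker forge a file that matches
# a known digest.

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Digest obtained over a trusted channel (e.g. printed on the vendor's site).
published_digest = sha256_hex(b"release tarball contents")

downloaded = b"release tarball contents"
tampered = b"release tarball contents plus a backdoor"

# compare_digest does a constant-time comparison to avoid timing leaks.
print(hmac.compare_digest(sha256_hex(downloaded), published_digest))  # True
print(hmac.compare_digest(sha256_hex(tampered), published_digest))   # False
```

The same logic is why a "secret" algorithm buys nothing: the security lives entirely in the data (the digest or key), not in hiding how the algorithm works.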

Quote:
I have the upmost respect for your opinions, Troberg, but I am afraid that I can not agree with your data or your conclusions - and analysis of this sort of thing is what I do for a living.
I think we'll just have to agree to disagree. As a developer, former sysadmin and average home user with 25-something computers on my home network, my experience does not agree with yours.

On my Windows machines, security is a constant concern: up-to-date virus scanners, firewalls, stopping a whole bunch of services that Windows starts by default, and choosing another web browser are all needed to keep them relatively safe. On the Linux machines, security is more or less a non-issue. Basic common sense only, and I've yet to see a successful attack.
#25 - 03 December 2007, 04:40 PM - diddy (Plymouth, MN)

Troberg: Great analysis but there is one point that I would like to add:

* User trust: I feel that MS relies heavily on the user's ability to know what is safe for their computer and what is not. Combine this with their ease-of-use approach and you have a problem. Users cannot be implicitly trusted with full system-level access, yet most users run with it on a regular basis. UAC helps, but only to the degree that it annoys users and trains them to just hit "OK" to bypass something that is bothering them. With UAC, the control is still in the user's hands, and experience tells me that people will hit OK even to things they don't understand.
#26 - 03 December 2007, 06:12 PM - Troberg (Borlänge, Sweden)

Quote:
experience tells me that people will hit OK even to things they don't understand.
Especially to things they don't understand...
#27 - 03 December 2007, 06:24 PM - BluesScale (Woolhampton, Berkshire, UK)

Ok, cards on the table. I work for Microsoft and I am a security specialist. However, what I am saying here is my own personal opinion and I am not speaking on behalf of Microsoft.

“It depends a lot in which Linux you are talking about, most of them are pretty tough.”

Buffer overrun vulnerabilities exist in Linux – and in Windows, and in thousands of third-party applications. Because of the way that machine code uses the stack, a carefully written exploit can work around the security features of Linux or Windows. I am not going to give sample code for obvious reasons, but I have seen exploits for both platforms using the exact same approach. A glance through the CVE lists will lead you to the same conclusion, I would suggest.

“I would say that it hit almost 100% of the unprotected servers, and they are a large part of the internet.”

The peak was on July 19th, 2001, when the infected total got to a shade under 360,000 systems. However, 100% of the unprotected servers? No-one has the figure for how many unprotected servers there were. Maybe 360,000 was 10%, or maybe 100%, but there is no way of knowing. My estimate would be around 65% or 70%, but that is just a guess.

“Wrong. Being open source means that most vulnerabilities are found before they are even put into the code base of a release version. Security can be verified, something that can never truly be done on a closed source system.”

Uh, the assumption here is that closed source does not get code reviewed. I can assure you that this is NOT the case and it gets inspected. Old code gets reviewed periodically as well.

“Lack of competence. Sure, the competence exists within the company, but with the recent hiring sprees it has become diluted.”

Having reviewed both MS source code and third-party code, I have to say that MS code is generally better. Yes, there are areas where the implementation quality is not as good as it would be in a perfect world, but you and I have both been developers in the real world. That is true of every product. By and large, I disagree that MS has less competent developers than a self-appointed group of people who write code for free.

“Dangerous focus. MS has always chosen user friendliness before safety, and many of the most stupid and dangerous exploits stem from this doctrine.”

So, your argument is that less user-friendly code is safer code? With respect, I don’t see how that can be. What the UI looks like has nothing to do with whether the code can be subjected to shatter attacks, buffer overruns, forced race conditions and all the rest. Also, I seem to recall you claiming that some of the Linux variants were more user-friendly than Windows. Are you sure that you are not arguing against yourself here? I am sorry, but I disagree with you here also. The most dangerous exploits were all against services which failed to validate incoming data properly.

“Building on a bad base. Parts of Windows are 20 years old, dating back to when networking was part of DOS. Sure, they have been recompiled for 32-bit and 64-bit, but the basic design and thinking still carry a 20-year-old heritage. A full dump-and-rewrite is overdue, and has been for a while.”

Hmmm. Well, there is some truth here. Unfortunately, application compatibility is a critical issue to many customers. They want all their software to keep running. If large areas of the OS are radically redesigned breaking thousands of applications, people object. So, a lot of code does have to be carried forward – and in 1985, security was not the issue that it is now. The world was not online then. Most computers were standalone. The first PC virus was still a year in the future and the term “Virus” for self-replicating malware was a year old. Could the company have guessed that the internet was going to happen? No, I don’t think that anyone figured that one out.

“Trust in extra products for safety.”

Hmmm. If we bundle stuff, we are anti-competitive. If we don’t then we are failing to provide a full solution. Seems like a cleft stick to me.

“If the OS and the products used are safe, a firewall is not needed.”

Linux systems use firewalls too, and for the same reason as Windows systems. Defence in depth is good. A single line of defence is bad. Why? Because with a single line of defence, one breach and you're owned. With defence in depth, a compromise is contained. Look at a nuclear reactor some time – layers surrounded by layers.

“It's even a false sense of security, as most attacks are indistinguishable from normal traffic for the firewall.”

Uh, depends on the firewall. Some of the Cisco firewalls and ISS filters can detect specific exploit attempts. However, the point is that the job of the firewall is to prevent attacks from outside the wall getting through the wall. It does that fine – not that there is anything Windows specific about that.
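The difference between the two kinds of firewall can be made concrete with a toy filter. This is an illustration only, not any real product's logic; the packet fields and the Code Red-style signature string are simplified for the example:

```python
# A port-based filter passes anything addressed to an open port, so an
# exploit delivered over port 80 looks exactly like a normal web request
# to it. A deep-inspection filter can also match payloads against known
# attack signatures.

ALLOWED_PORTS = {80, 443}

def port_filter(packet):
    """Classic stateless filter: looks only at the destination port."""
    return packet["dst_port"] in ALLOWED_PORTS

def inspecting_filter(packet):
    """Also rejects payloads matching a known exploit signature."""
    return port_filter(packet) and b"default.ida?NNNN" not in packet["payload"]

normal  = {"dst_port": 80, "payload": b"GET /index.html HTTP/1.0"}
exploit = {"dst_port": 80, "payload": b"GET /default.ida?NNNN... HTTP/1.0"}
telnet  = {"dst_port": 23, "payload": b"login"}

print([port_filter(p) for p in (normal, exploit, telnet)])        # [True, True, False]
print([inspecting_filter(p) for p in (normal, exploit, telnet)])  # [True, False, False]
```

The port filter happily passes the exploit because, to it, the exploit is just HTTP to an open port; only the signature-aware filter catches it, and only for attacks it already knows about.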

“Drivers run outside kernel control, which means that they are outside OS control. This allows nasties like rootkits and allows a broken driver to kill the entire OS. In most other OS's drivers run like any other program, firmly under OS control.”

First rootkits were Unix based. Sorry.

Yes, NGSCB does mean that it is harder for the user to control what is running in some circumstances. 77% of malware is installed by users who have been socially engineered. As long as the user still gets the functionality that they need, insisting on code being known and trusted seems like a good thing. It is putting Safety before friendliness which is what you wanted earlier.

“MS Update / Windows update. This is a big eff upp from beginning to end. They don't tell clearly what the updates will do”

Updates are intentionally vague because if we tell people exactly what we fixed, we immediately tell all the Blackhats how to exploit unpatched systems.

“It has security problems (twice, I've received viruses that way, luckily my virus scanner killed them)”

I am terribly sorry but I think that you are mistaken. We have NEVER had a single report of this happening. Updates go on to more than 60 million systems. If there was malware in the source (and they ARE signed and the WU engine uses very good cryptography) then we would have thousands of reports if not millions.

“it installs stuff I don't need (why would I need Office updates or media players on Windows Server?)”

You are suggesting that vulnerabilities should be left because users are not recommended to use that OS feature on a server? I am sorry but that strikes me as high risk.

“ It's injecting foreign, unverifiable code into my system.”

No more foreign than the rest of the OS.

“9 out of 10 MS employees eat small children, according to estimates made by Linux experts. This gives them very little time to think about security issues.”

Pah, not true. We eat small children only when we have run out of cute kittens to chow down on. But, in fairness, we do eat them live.

“A server needs to be stable, mystery updates are too big of a risk.”

Yes, there is a risk in installing an update – any change can be a danger. If you would sooner not then you may well end up speaking to me or my colleagues as we help you clean the box.

“Windows is closed source. This means that security can not be verified. No independent experts can check the source, and we are left in a situation where we must trust the word of MS.”

Not all that closed. Want a copy?
http://www.microsoft.com/resources/s...g/windows.mspx

“MS has "educated" their users to believe that a constant flood of patches means security, while the real story is the opposite. The need for patches comes from an unsafe system. Now people think that an OS that does not have five patches a day is unsafe. Silly, but they believe it.”

Uh, I am sorry again. Do you think that holes should be patched or not? I agree that there would be no need in a perfect world with a clean room implementation knowing what we learned the hard way. That is not where the world is coming from. Also, Open Source stuff gets updates too.

“I think we'll just have to agree to disagree. As a developer, previous sysadmin and an average home user with 25-something computers on my home network, my experience does not agree with yours.”

Ok, as a professional making my living in this specific field, we will have to agree to disagree.

“On my Windows machines, security is a constant concern and up to date virus scanners, firewalls and stopping of a whole bunch of services which Windows starts as default and a choice of another web browser is needed to keep them relatively safe.”

I agree on the need to keep virus scanners up to date on ANY operating system; there is nothing Windows-specific about that. I agree that fewer services means a smaller attack surface, which is better. Later operating systems (Server 2003, Server 2008) have fewer services enabled by default. As for the web browser, I have seen the claims as to why Firefox/Mozilla is more secure. They do not hold water. The technique that a plug-in would use to subvert the browser is slightly different from what a BHO would do, but the effect is the same and there is no difference in difficulty.

“On the Linux machines, security is more or less a non-issue. Basic common sense only and I've yet to see a successful attack.”

Well, there is very little that runs on Linux... Kidding!

In all seriousness, there are attacks on Linux too. Metasploit is full of them.

Blues
#28 - 03 December 2007, 09:08 PM - Insensible Crier (Pittsburgh, PA)

In my experience, the biggest IT disasters have nothing to do with OS, software or hardware. It is often the result of someone being incredibly stupid, lazy, short-sighted or flat out incompetent. Let me know when any operating system has a patch or an update to fix those.
#29 - 03 December 2007, 09:09 PM - BluesScale (Woolhampton, Berkshire, UK)

Sorry, currently blocked waiting on the release of Human 2.0.

Blues
#30 - 04 December 2007, 04:48 AM - Troberg (Borlänge, Sweden)

Quote:
Uh, the assumption here is that closed source does not get code reviewed. I can assure you that this is NOT the case and it gets inspected. Old code gets reviewed periodically as well.
Sure it does, but not to the same extent. Have you seen the recent discussions on the alternatives for a new task scheduler in Linux? That's a lot of people looking into that code, nitpicking every detail.

Quote:
By and large, I disagree that MS has less competent developers than a self appointed group of people who will write code for free.
I'm not so sure about that. Open source is a meritocracy. Those who gain influence are the good ones, the ones who produce good code.

Quote:
So, your argument is that less user friendly code is safer code?
No, it's that safety must come first when a decision has to be made where safety and user friendliness come into conflict.

Quote:
With respect, I don’t see how that can be. What the UI looks like has nothing to do with whether the code can be subjected to shatter attacks, buffer overruns, forced race conditions and all the rest.
Well, Windows has had some really stupid exploit possibilities based on friendliness, such as automatically running files attached to emails, or allowing ActiveX in web pages, which lets a web page do whatever it likes with the machine. Sure, they were eventually fixed, or at least made possible to turn off, but what were they thinking when they put them in in the first place?

Quote:
“Trust in extra products for safety.”

Hmmm. If we bundle stuff, we are anti-competitive. If we don’t then we are failing to provide a full solution. Seems like a cleft stick to me.
I'm not talking about bundling the extra products; I'm talking about building the system secure enough that they are not needed.

Quote:
Linux systems use firewalls too and for the same reason as Windows systems. Defence in depth is good. A single line of defence is bad. Why? Because with a single line of defence, any breach and you got owned. With defence in depth, a compromise is contained. Look at a nuclear reactor some time – layers surrounded by layers.
On most Linux systems I've seen, a firewall has been used to secure the outer shell, the entrances to the network. Inside the network, no firewalls are needed. That's a legitimate use, for many reasons. However, a firewall on every workstation is just adding overhead and should not really be needed.

Quote:
“It's even a false sense of security, as most attacks are indistinguishable from normal traffic for the firewall.”

Uh, depends on the firewall.
Correct, but for consumer products, as opposed to expensive corporate products, they are indistinguishable. There is also an overhead in looking into the packets instead of just looking at which port they are headed for, as well as a risk of filtering out good packets, which can cause strange errors that are hard to track down.

Quote:
First rootkits were Unix based. Sorry.
Yep, and now they have learned. It's much more difficult today.

I'm not making the common mistake of comparing Linux to an imaginary, perfect OS. There are way too many people pointing at some bug in Linux and going "But look, Linux also has bugs!". Sure, it's not without problems, but compared to the competition, it's much better.

Quote:
Yes, NGSCB does mean that it is harder for the user to control what is running in some circumstances. 77% of malware is installed by users who have been socially engineered. As long as the user still gets the functionality that they need, insisting on code being known and trusted seems like a good thing. It is putting Safety before friendliness which is what you wanted earlier.
Except that hiding things from the user is "safety through obscurity", which has been disproven time and time again. We also have the moral issues. The hardware belongs to the user. What right does MS or someone else have to run software hidden from the user, hidden from the owner of the system?

Quote:
Updates are intentionally vague because if we tell people exactly what we fixed, we immediately tell all the Blackhats how to exploit unpatched systems.
Yep, so that's why I'm not told that it will install a new IE I don't want, even though I've spent half a day removing as much of IE as I could from my system and replacing as much as I could of what I couldn't remove with stub DLLs?

Once again, "security through obscurity" does not work. Those who wish to exploit the system have access to Google; they'll find the information they need.

Quote:
I am terribly sorry but I think that you are mistaken. We have NEVER had a single report of this happening. Updates go on to more than 60 million systems. If there was malware in the source (and they ARE signed and the WU engine uses very good cryptography) then we would have thousands of reports if not millions.
In both cases, it was on a freshly installed Win2000 server, on a virus-free (at least the virus scanner did not detect anything) network behind a firewall. I ran the update, and suddenly the virus scanner on the machine started screaming bloody murder; after checking closer, it was the nasty, but rather fun (in a nasty way), w32.magistr virus in one of the update packages.

It might be that MS is not to blame; in that case someone has gone to the bother of doing a DNS spoof and setting up their own server mimicking the MS server. Still, if this is possible, it's a security flaw in MS Update.

Quote:
You are suggesting that vulnerabilities should be left because users are not recommended to use that OS feature on a server? I am sorry but that strikes me as high risk.
I'm saying that if those products are not installed, the patches should not be installed, and in the case of the media player, the product should not be installed at all, since I've already chosen to not install it.

Quote:
“ It's injecting foreign, unverifiable code into my system.”

No more foreign than the rest of the OS.
It's code I don't have any chance to verify before it's installed.

Quote:
Yes, there is a risk in installing an update – any change can be a danger. If you would sooner not then you may well end up speaking to me or my colleagues as we help you clean the box.
Or you could make the system secure enough not to need constant patching. You could also make patching safer and less intrusive. Look, for instance, at the Adept/apt-get package manager: install and uninstall on the fly, without rebooting, always safely.

This, by the way, is another problem with Windows: no centralised management of installed software. The system has no clear idea of which software uses which shared files, so uninstalling one application sometimes breaks another, and installing software can break things as well by installing a different version of a shared file. This is a total non-issue in Linux, where the package manager handles every install/uninstall, has complete control over each application's needs, and makes sure that they are met.
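What a central package database buys can be sketched in a few lines. This is a minimal illustration of the idea, not how apt/dpkg actually works; the package and file names are invented:

```python
# Reference-counting shared files means uninstalling one application never
# deletes a library that another installed application still needs.

class PackageDB:
    def __init__(self):
        self.packages = {}   # package name -> set of files it installed
        self.refcount = {}   # file -> number of installed packages using it

    def install(self, name, files):
        self.packages[name] = set(files)
        for f in files:
            self.refcount[f] = self.refcount.get(f, 0) + 1

    def remove(self, name):
        deleted = []  # files that may actually be deleted from disk
        for f in self.packages.pop(name):
            self.refcount[f] -= 1
            if self.refcount[f] == 0:
                del self.refcount[f]
                deleted.append(f)
        return sorted(deleted)

db = PackageDB()
db.install("app-a", ["shared.dll", "a.exe"])
db.install("app-b", ["shared.dll", "b.exe"])

print(db.remove("app-a"))  # ['a.exe']  -- shared.dll survives, app-b needs it
print(db.remove("app-b"))  # ['b.exe', 'shared.dll']
```

Without such a database, each installer guesses on its own whether a shared file is still needed, which is exactly how "DLL hell" breakage happens.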

Quote:
Not all that closed. Want a copy?
http://www.microsoft.com/resources/s...g/windows.mspx
Well, considering that I'd have to sign a stack of non-disclosure agreements as high as the Eiffel Tower, backed up by an army of attack lawyers, to see that code, I can't really see how it could work as a way of pointing out possible dangers.

Quote:
Uh, I am sorry again. Do you think that holes should be patched or not? I agree that there would be no need in a perfect world with a clean room implementation knowing what we learned the hard way. That is not where the world is coming from. Also, Open Source stuff gets updates too.
The thing is that most of them could and should have been seen before the system went into production. OSS also gets patches, but far less often, because it's inherently safer.

Also, don't mix up security patches with updates. A security patch fixes a possible exploit; an update provides new functionality.

Quote:
As for the web browser, I have seen the claims as to why Firefox/Mozilla is more secure. They do not hold water.
Well, they don't run ActiveX. That's one big hole not left wide open.

Quote:
In all seriousness, there are attacks on Linux too. Metasploit is full of them
Sure, it's not perfect, just better.
  #31  
Old 04 December 2007, 11:48 AM
BluesScale is offline
 
Join Date: 29 December 2005
Location: Woolhampton, Berkshire, UK
Posts: 1,355

“Sure it does, but not to the same extent. Have you seen the recent discussions on the alternatives for a new task scheduler in Linux? It's a lot of people looking into that code, nitpicking about every detail.”

Hmmm. Ok, and how is it ensured that people review the boring bits of the code as well? Also, I suspect that a lot of the discussion is about required functionality and not how to make it attackproof.

“I'm not so sure about that. OS is a meritocracy. Those who gain influence are the good ones, the ones who produce good code.”

Yep. We promote good devs too.

“No, it's that safety must come first when a decision has to be made where safety and user-friendliness come into conflict.”

We try to do what customers want. Looking at the sales figures, I think that we have had some success in this regard. That said, some decisions were made in an environment that was different… and in a corporate environment, some of the decisions are better than they are for home users.

“Well, Windows has had some really stupid exploit possibilities based on friendliness, such as automatically running attached files in emails or allowing ActiveX in web pages, which allows a web page to do whatever it likes with the machine. Sure, they were eventually fixed, or at least made possible to turn off, but what were they thinking when they put them in in the first place?”

Again, the world changes. Again, inside a corporate LAN, these are perhaps better decisions. There were always some security restrictions with ActiveX controls, and we always supported signing of ActiveX controls. Additional functionality can often be misused, and ActiveX was sometimes abused. Some controls that were not safe for scripting were marked as safe for scripting. Some were safe for scripting in that the functionality was perfectly OK, but there were ways of injecting code through poor validation. However, that is very much a problem of the past rather than a current issue.

“I'm not talking about bundling the extra products, I'm talking about building the system securely so that they are not needed. On most Linux systems I've seen, a firewall has been used to secure the outer shell, the entrances to the network. Inside the network, no firewalls are needed.”

You always need multiple layers. Defence in depth is regarded as the best possible practice. In practice, every system will have some vulnerabilities (even hippy free love operating systems :-) ) and the question is what should happen if someone used one. Do you declare the game over or try to control the breach? Multiple layers, every time.

“Correct, but for consumer products, not expensive corporate products, they are indistinguishable. There is also an overhead in looking into the packets instead of just looking at which port they are headed for, as well as a risk of filtering out good packets, which can cause strange errors that are hard to track.”

Yes, corporate network needs are different to those of home users. Home users typically need limited traffic types over a few ports and so simpler firewalls work well for them. There is nothing OS specific about that though.

“I'm not making the common mistake of comparing Linux to an imaginary, perfect OS. There are way too many people pointing at some bug in Linux and going "But look, Linux also has bugs!". Sure, it's not without problems, but compared to the competition, it's much better.”

I disagree that it is much better. It is different. Some decisions are better, some are worse. The marketplace will decide that one.

“Except that hiding things from the user is "safety through obscurity", which has been disproven time and time again. We also have the moral issues. The hardware belongs to the user. What right does MS or someone else have to run software hidden from the user, hidden from the owner of the system?”

Choose please. You say that we give the user too much control and then argue that the user should have complete freedom. You can have one or the other. Also, how is this safety through obscurity? By giving the OS complete control, nothing else has control.

“Yep, so that's why I'm not told that it will install a new IE I don't want, even though I've spent half a day removing as much of IE as I could from my system and replacing as much as I could of what I couldn't remove with stub DLL's?”

With all possible respect, IE is intrinsic to the OS these days and that was shown in court – and again, with all possible respect, yours is not a typical user scenario. WU/MU is designed for normal home users, not people who mod the OS. If you want more control, you can get the updates individually. WU/MU is not enforced on you.

“In both cases, it was on a freshly installed Win2000 server on a virus-free (at least the virus scanner did not detect anything) network behind a firewall. I ran update, and suddenly the virus scanner on the machine started screaming bloody murder, and after checking closer, it was the nasty, but rather fun (in a nasty way) w32.magistr virus in one of the update packages.”

Uh, w32.magistr has only ever been seen as a standalone EXE. What mechanism could possibly get this on Windows Update? I strongly suspect that you were compromised through some other mechanism. You did have an unpatched system on the web, after all. It is very difficult to set up a spoof server because of the token exchange. I have never seen it done successfully.


“I'm saying that if those products are not installed, the patches should not be installed, and in the case of the media player, the product should not be installed at all, since I've already chosen to not install it.”

Ok. Windows 2008 can be installed without even a GUI so I think that we have addressed that point.

“It's code I don't have any chance to verify before it's installed.”

Yes. You are unable to do a code review of new code before installation. I think that you will agree that this is not a mass market requirement and is true of all non-OSS software.

“You could make the system secure enough to not need constant patching. You could also make patching safer and less intrusive”

Hmmm. I think that we have discussed the reason for legacy code. This is a requirement of legacy code. There is no way to make each process unload and reload a component without restarting that process. After all, a DLL will be holding state. If a component is still in use, replacing it is problematic. It is easy enough to do this with a service like IIS, where you can stop and restart the service, but much harder when you are talking about replacing a component used by applications which we have never seen before.
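For what it's worth, the difference between the two platforms' treatment of in-use files can be demonstrated in a few lines of Python (POSIX-only; on Windows the `os.remove` call would fail because loaded DLLs are locked). This is the mechanism that lets Unix-style systems swap a library under a running process:

```python
# On POSIX, an open file can be unlinked while a process still holds a
# handle to it; the process keeps reading its old copy, and a new version
# can be dropped in under the same name. Windows instead locks loaded
# DLLs, so the file cannot be replaced until every user of it exits.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "libdemo.so")
with open(path, "w") as f:
    f.write("old version")

handle = open(path)          # simulate a process with the library loaded
os.remove(path)              # step 1: unlink the old file (fails on Windows)
with open(path, "w") as f:   # step 2: install the new version in its place
    f.write("new version")

print(handle.read())         # the "running process" still sees: old version
handle.close()               # on its next start it would pick up the new file
```

This is only a sketch of the filesystem semantics involved, not a claim about how Windows Update or any particular package manager performs the swap.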

“This, by the way, is another problem with Windows: no centralised management of installed software. The system has no clear idea of which software uses which shared files, so uninstalling software sometimes breaks other software, and installing software sometimes breaks other software by installing another version of a file. This is a total non-issue in Linux, where the package manager handles every install/uninstall, has complete control over each application's needs, and makes sure that they are met.”

Yes, I agree that the installation/uninstallation of apps in Windows is clunky and that something better is needed.

“Well, considering that I'll have to sign a stack of non disclosure agreements as high as the Eiffel tower, backed up by an army of attack lawyers, to see that code, I can't really see how it could work as a way of pointing out possible dangers.”
Yet it works. If you look at the security bulletins, we credit the finder.

“The thing is that most of them could and should have been seen before the system went into production. OSS also gets patches, but far less often, because it's inherently more safe.”

No, I don’t agree that it is inherently safer. It is inherently less attacked and accordingly less often compromised. Blackhats target the mass market.

“Also, don't mix up security patches with updates. A security patch fix a possible exploit, an update provides new functionality.”

No, an upgrade provides new functionality. An update is just a later version. Easy mistake to make though.

Quote:
As for the web browser, I have seen the claims as to why Firefox/Mozilla is more secure. They do not hold water.

“Well, they don't run ActiveX. That's one big hole not left wide open.”

Actually, there is a plug-in that allows Firefox to host ActiveX. Plug-ins and BHOs are more commonly malicious than ActiveX controls, since they stay loaded, unlike ActiveX controls, whose lifetime is limited to that of the page. An ActiveX control could be used to install a Trojan, but normally we see .exe files used for this, with social engineering.

Quote:
In all seriousness, there are attacks on Linux too. Metasploit is full of them

“Sure, it's not perfect, just better.”

Objectively? It is a religious argument and you will find people who prefer one over the other. Is Linux better? It has good and bad points.

However, we seem to have strayed a fair way from “Vista is teh debil and will make your man bits fall off”. Operating systems are complex things and no one OS is all good or all bad (well, ok, Millennium came close). By all means slam us when we make mistakes but Windows works well for many millions of people and, if they follow the guidelines (which are default in XP SP2 and later) they will enjoy a good level of safety. We are working every day of the year, 24 hours a day to make things safer and better. Vista is safer than XP. 2008 is safer than 2003. 64 bit is almost bulletproof judging by the number of reports that we are seeing.

Blues.
  #32  
Old 04 December 2007, 02:23 PM
Troberg is offline
 
 
Join Date: 04 November 2005
Location: Borlänge, Sweden
Posts: 11,580

Quote:
Hmmm. Ok, and how is it ensured that people review the boring bits of the code as well? Also, I suspect that a lot of the discussion is about required functionality and not how to make it attackproof.
How much more boring than a task scheduler does it get?

In this case, the discussion was about perceived performance, but where it's more relevant, security is just as actively discussed. The important thing is that many experienced developers (and shitloads of less experienced ones) look at the code.

Quote:
Yep. We promote good devs too.
Now I'm speaking from personal experience, but in my experience, in a large organization, people are promoted to their level of incompetence. As long as they manage to do their job, they get promoted. People seldom get pushed down a step, though, so they tend to reach a point where they can't do a proper job, at which point they get stuck there (unless they get promoted to another department in order to get rid of them).

Quote:
We try to do what customers want. Looking at the sales figures, I think that we have had some success in this regard.
Sure, I don't doubt that. The problem is that users don't really know what's best for them. They want to look at web sites with bouncing balls and bells and whistles, but they don't understand that they also open themselves up for attack by doing that. That's why we have experts.

Quote:
Again, the world changes. Again, inside a corporate LAN, these are perhaps better decisions. There were always some security restrictions with ActiveX controls and we always supporting signing of ActiveX controls. Additional functionality can often be misused and ActiveX was sometimes abused. Some controls that were not safe for scripting were marked as safe for scripting. Some were safe for scripting in that the functionality was perfectly OK but there were ways of injecting code through poor validation.
The problem with those schemes is that they rely on either a competent user or a nice producer of ActiveX controls. What does some hacker in Nigeria care whether his signed control can be traced to him by MS, if he can empty a bunch of bank accounts and transfer the contents to his own?

Besides, even if everything works as designed, there are still issues. We had problems when a popular Swedish site used an ActiveX control that used some older version of a runtime file. Whenever one of our users went to that site, database access in our software stopped working and we had to manually replace the file with the correct one. Then they went to that site again...

Quote:
You always need multiple layers. Defence in depth is regarded as the best possible practice. In practice, every system will have some vulnerabilities (even hippy free love operating systems :-) ) and the question is what should happen if someone used one. Do you declare the game over or try to control the breach? Multiple layers, every time.
If they get through one firewall, the next one will most likely be breachable with the same method.

I use a strong shell protection together with an active and aggressive response to detected attacks. It's just not worth the sacrifice in performance and cost to use it inside my private network.

Quote:
Yes, corporate network needs are different to those of home users. Home users typically need limited traffic types over a few ports and so simpler firewalls work well for them. There is nothing OS specific about that though.
In my experience, it's the other way around. Companies need web access, mail, maybe FTP and that's more or less it. Home users listen to radio, skype, use file sharing, play games and so on.

Quote:
I disagree that it is much better. It is different. Some decisions are better, some are worse. The marketplace will decide that one.
The market would decide if it was a level playing field. It's about as level as Switzerland.

Quote:
Choose please. You say that we give the user too much control and then argue that the user should have complete freedom. You can have one or the other. Also, how is this safety through obscurity? By giving the OS complete control, nothing else has control.
Give the user full possibilities to do what they like with their machine, but make the environment safe. Look at Linux, where all potentially dangerous operations require you to enter a root password. This means that you can do them, but an attacker or a piece of malicious software can't, even with access to the console, because they don't have the password. Linux also has a much clearer divide between operations with different privilege requirements.
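That privilege divide can be sketched as a toy model (the operation names are hypothetical; on a real system the kernel enforces this per process, it is not an in-process check like this one):

```python
# Toy model of Unix-style privilege separation: ordinary operations
# succeed for any user, while dangerous ones require root (euid 0).
# A real kernel enforces this itself; this only models the divide
# between everyday work and operations gated behind the root password.

PRIVILEGED = {"install_driver", "format_disk", "edit_system_config"}

def allowed(operation, euid):
    """Return True if a user with effective uid `euid` may perform `operation`."""
    return euid == 0 or operation not in PRIVILEGED

print(allowed("read_own_files", euid=1000))  # True: normal work needs no root
print(allowed("format_disk", euid=1000))     # False: malware running as the
                                             # user is stopped right here
print(allowed("format_disk", euid=0))        # True: the owner, via su/sudo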

By giving the OS complete control, you have basically stolen the computer from the user.

Quote:
With all possible respect, IE is intrinsic to the OS these days and that was shown in court – and again, with all possible respect, yours is not a typical user scenario. WU/MU is designed for normal home users, not people who mod the OS. If you want more control, you can get the updates individually if you want then. WU/MU is not enforced on you.
So be it, but I still don't want sneak updates made like that. It took me a lot of time to dig out all the required APIs in the affected DLLs and write replacement stubs, and I didn't even get a warning: "We are about to install IE vX.X, do you want to continue?". Keep WU/MU to patches only, and don't sneak new versions or additional software in under the guise of service packs.

Quote:
Uh, w32.magistr has only ever been seen as a standalone EXE. What mechanism could possibly get this on Windows Update? I strongly suspect that you were compromised through some other mechanism. You did have an unpatched system on the web, after all. It is very difficult to set up a spoof server because of the token exchange. I have never seen it done successfully.
Well, I got it. The machine was behind a firewall, the virus protection on my other machines would have detected it if I had it on the network. It was the virus detection on my firewall that found it in the data stream between the update server and the new machine, and I could repeat it by running WU/MU again. If it didn't come from MS, someone has managed to spoof it.

Quote:
Ok. Windows 2008 can be installed without even a GUI so I think that we have addressed that point.
Cool, I didn't know that. Is it still called "Windows" when there are no windows?

About time. Are there any other changes to make it more of a server OS and less of a desktop OS?

Btw, somewhat off topic: why is there no desktop Windows for 4 CPUs? It's not just servers that use multiple CPUs; high-performance workstations also want the extra power.

Quote:
Yes. You are unable to do a code review of new code before installation. I think that you will agree that this is not a mass market requirement and is true of all non-OSS software.
Of course, most people don't do it. In fact, very few do it. It doesn't matter. The point is that enough people would do it to find potential problems.

As you say, it's true for most closed source software (some software provides the source but doesn't allow you to alter or redistribute it, and thus doesn't qualify as OSS). On the other hand, most closed source doesn't require frequent patching.

Quote:
Hmmm. I think that we have discussed the reason for legacy code. This is a requirement of legacy code.
Yep, I know. That's an explanation, but not an excuse. It's also an unavoidable drawback of closed source. With open source, changes can be made to the OS, and all that's required for old apps is usually a recompile and the legacy code is shiny and new again. Somewhat simplified, of course, but not that far from the truth.

This, by the way, is also why Linux is ported to run on just about everything that has a CPU. All you need to adapt is gcc (the C compiler) and some header files. From there on, recompile stuff for the new platform and everything should, in theory, be nice and working. Of course, it's not that simple, as some bugs will appear and some adaptations have to be made, but it's still much simpler than if you don't have the source.

Quote:
No, I don’t agree that it is inherently safer. It is inherently less attacked and accordingly less often compromised. Blackhats target the mass market.
Sorry, but that doesn't hold water. The mass market for servers on the internet is Linux, yet it's still considered safer. Also, a Linux machine is a more desirable target if you are going to use it as a bot, which also makes them more tasty targets.

Quote:
No, an upgrade provides new functionality. An update is just a later version. Easy mistake to make though.
That depends on the company one works for. My point was that a patch is just that, a patch that plugs a hole, without installing any new stuff or changing the old stuff beyond what's needed to plug the hole.

Quote:
As for the web browser, I have seen the claims as to why Firefox/Mozilla is more secure. They do not hold water.
Perhaps; I have no experience with them. I use Opera and Konqueror, and they are certainly more secure than both IE and FF. Wikipedia seems to think that IE is worst, though: http://en.wikipedia.org/wiki/Compari...ulnerabilities

Quote:
Actually, there is a plug-in to allow Firefox to host ActiveX.
Don't get me wrong, I dislike that as well. If it was up to me, the web would have no active client-side content at all, all would be server-side. That would be safe, and would still provide the same basic functionality, minus some bouncing balls. Since gay porn is not my cup of tea, I can do without bouncing balls.

Quote:
In all seriousness, there are attacks on Linux too. Metasploit is full of them
Want to count exploits per OS and compare? I think Windows will have more, even if they are scaled according to user base.

Quote:
Objectively? It is a religious argument and you will find people who prefer one over the other. Is Linux better? It has good and bad points.
It's better for me. The one thing I lack is a good rapid application development environment, otherwise I would have moved all my machines to Linux. There are a few, but they are either too expensive for my taste or too simple. No doubt this will change, as developers are a lazy bunch who will do a 100-hour task by spending 99 hours writing a tool that will do it in 1 hour.

Quote:
no one OS is all good or all bad (well, ok, Millennium came close).
Actually, I don't understand the hatred towards ME. In the Win95 branch of the family tree, it was the best (properly tuned and with some bloat stuff removed). Win98 was much worse. I stayed with Win95 until I needed USB, then jumped directly to ME (for games only, for other uses, I went the NT 3.1, NT 3.5, NT 4, 2000 route).

Quote:
By all means slam us when we make mistakes but Windows works well for many millions of people and, if they follow the guidelines (which are default in XP SP2 and later) they will enjoy a good level of safety.
How do you define "works well"? I agree that later versions are better (to some extent; I still don't like Vista, for many reasons, not all of them technical), but it still requires a re-install every two years or so if you are not very careful. Weird problems still pop up, such as the incredibly annoying mup.sys problem that has killed three installations for me and which was actually what made me try Linux in the first place. Running a Windows machine (that's actively used, not just waiting around) for more than a month or two without a restart is still an impressive feat, compared to Linux and Novell machines that I've had running with over two years of continuous uptime, and which would still be running if I hadn't had a power failure that forced the issue.

Quote:
We are working every day of the year, 24 hours a day to make things safer and better.
Ah, that explains Vista. You must be very, very tired.

Quote:
Vista is safer than XP. 2008 is safer than 2003. 64 bit is almost bulletproof judging by the number of reports that we are seeing.
How many run 64-bit?

In my opinion, the safest Windows today is a Windows 2000, properly patched and with unneeded services shut down. I've had two of those connected directly to the internet for over five years, without a firewall (I use them for file sharing, and my firewall can't really take the load) and there has not been a single successful attack against them.

Why is Win2000 safe? Well, because it's well known. The obscurity is gone, the exploits have been found and fixed and the holes are plugged. A new OS means a new set of exploits.
  #33  
Old 08 December 2007, 03:41 PM
BluesScale is offline
 
Join Date: 29 December 2005
Location: Woolhampton, Berkshire, UK
Posts: 1,355

Sorry for the slow reply. Been very busy.

“How much more boring than a task scheduler does it get?”

Have you ever reviewed printer driver code? When you have, you will understand the meaning of the word “tedium”.

“Now I'm speaking about personal experience, but in my experience, in a large organization, people are promoted to their level of incompetence.”

A common experience. In practice, one that we rarely hit in technical roles because people are not often promoted from seriously technical roles to managerial roles. Lower and middle managers are sometimes contract staff.

“The problem is that users don't really know what's best for them. … That's why we have experts”

Not an opinion that I expected from an advocate of open source. I disagree. We should provide customers with what they want as far as possible – and do what we can to make them safe. They may be ill-informed, but they are adults, not children.

“The problem with those schemes is that they either rely on a competent user or a nice producer of ActiveX controls.”

Again, we don’t get to make that kind of decision for the nice customer. All we can do is offer them the choice.

“We had problems when a popular Swedish site used an ActiveX control that used some older version of a runtime file.”

Which is why “side by side” and manifests were implemented. It was a problem. It still is sometimes with older components.

“If they get through one firewall, the next one will most likely be breachable with the same method.”

Not in my experience or in case studies. Hardware and software firewalls really do help. Most successful hacking attacks get around the firewall; most unsuccessful attacks flounder there. Firewalls protect very well against worms.

“Quote:
Yes, corporate network needs are different to those of home users. Home users typically need limited traffic types over a few ports and so simpler firewalls work well for them. There is nothing OS specific about that though.
In my experience, it's the other way around. Companies need web access, mail, maybe FTP and that's more or less it. Home users listen to radio, skype, use file sharing, play games and so on”

Well, yes and no. Home users have a simpler topology in that it is local or it is internet – and most of their access is via TCP or UDP. A corporate LAN will have RPC and file sharing and active directory and cross domain calls with limited trust and all sorts.

“Sorry, but that doesn't hold water. The mass market for servers on the internet is Linux, yet it's still considered safer. Also, a Linux machine is a more desirable target if you are going to use it as a bot, which also makes them more tasty targets.”

Uh, again, sorry, disagree. Scale out, not up. A botnet needs a lot of PCs, not a few high-end servers. Servers make poor bot hosts because they have more or less expert operators – and that is equally true of Linux or Windows. Indeed, many bots disable things on Windows assuming that it is a workstation, and break the server if that is the host.

“Want to count exploits per OS and compare? I think Windows will have more, even if they are scaled according to user base”
Ok, I am game

“I use Opera and Konqueror, and they are certainly more secure than both IE and FF. Wikipedia seems to think that IE is worst”

Who am I to disagree with Wikipedia? Bet you a dollar that they can be hacked.

Quote:
We are working every day of the year, 24 hours a day to make things safer and better.

Ah, that explains Vista. You must be very, very tired

Well, it is not just me. We do have people working on security, standing by in case of a major incident, checking something all the time, across the world. We even have drills for what would happen if things went very bad indeed. You simply don’t get that with open source.

“How many run 64-bit?”

It is getting harder to find 32-bit systems any more.

“In my opinion, the safest Windows today is a Windows 2000, properly patched and with unneeded services shut down”

And it is pretty stable, I agree – but the safest? Server 2003 is pretty hardened out of the box and is really solid. 2008 looks like it is tougher yet – and yes, the UI-less version is still called Windows. Yes, there are multiple changes to make it more specialised as a server. No idea why there is a 4 CPU limit on consumer operating systems – it was a commercial decision rather than a technical one, I think.

“Why is Win2000 safe? Well, because it's well known. The obscurity is gone, the exploits have been found and fixed and the holes are plugged”

XP is only 2 years younger and the number of recent security updates is around the same.

Only 26% of vulnerabilities had exploit code

23% of infections were caused by vulnerabilities, the remainder from social engineering

24 critical updates for Windows 2000 in the last year

24 critical updates for XP in the last year

9 critical updates for Vista in the last year. Not bad for an SP0 product!

The figures don’t back you up, I am afraid

Blues
  #34  
Old 09 December 2007, 06:00 AM
Troberg is offline
 
 
Join Date: 04 November 2005
Location: Borlänge, Sweden
Posts: 11,580

Quote:
Not an opinion that I expected from an advocate of open source. I disagree. We should provide customers with what they want as far as possible – and do what we can to make them safe. They may be ill-informed but they are adults, not children
I think we are more or less saying the same thing in different ways here. What I mean is something along the lines of this analogy:

I have a pain in my stomach. I go to the doctor and say "It hurts here. Do your stuff.".

I don't go to the doctor and say "I have appendicitis, you have to operate. Place the cut here, then (rest of detailed explanation of the procedure snipped)".

The users come to us with their problem; we solve it.

Quote:
Again, we don’t get to make that kind of decisions for the nice customer. All we can do is offer them the choice.
That's a weak argument. You provide an unnecessarily unsafe solution, then claim that the user can choose whether he wants to use it or not. The fact is that the average user has no idea about the risk or how to protect themselves.

Compare ActiveX to Java/JavaScript. ActiveX has unrestricted access to your computer's hardware and your data/OS, while Java/JS runs in a safer sandbox. That's one example of how active content on the client side can at least be made somewhat safer.

Quote:
Which is why “side by side” and manifests were implemented. It was a problem. It still is sometimes with older components.
Side by side execution and manifests did not arrive until .Net. ActiveX is a pre-.Net technology and is still very much unsafe.

Quote:
Not in my experience or in case studies. Hardware and software files walls really do help. Most successful hacking attacks get around the firewall, most unsuccessful attacks flounder there. Firewalls protect very well against worms.
I'm not arguing that firewalls do no good, I'm arguing that several levels of firewalls give little additional benefit. If one level is breached, the next can also be breached. Of course, this is assuming that the same product is used with the same settings, but that's usually the case.

Quote:
Well, yes and no. Home users have a simpler topology in that it is local or it is internet – and most of their access is via TCP or UDP. A corporate LAN will have RPC and file sharing and active directory and cross domain calls with limited trust and all sorts.
That was true, but nowadays, home networks often contain 5-10 machines, VPN tunnels, domain controllers and so on. They are also often mixed environments, with different OS's and different hardware.

Look at my home network as an example. I have 15 or so Windows machines (mainly 2000, one XP and one 95 – an old machine that is too weak for anything else), about 10 Linux machines, 3 Xboxes, 1 Xbox 360, 2 PS2s, 1 PSP, one Wii and 2 Novell servers (3.11 and 3.20). Most of this is on a copper network, but the PSP, and partly the laptops (I'm trying to get them to use both wireless and wired network at the same time for performance reasons), are on wireless.

I can access my network from outside through a VPN tunnel. I run my own mail for security and practical reasons. I run two FTP servers for private use. I run two web sites (one of them currently down). I run a bunch of other internet services, such as a chat server, a voice chat server, and a couple of my own server products. Servers I run inside my network include over 10 TB of file servers, a few database servers and a bunch of application servers. All internet access (except for a few machines outside the firewall used for BitTorrent) is controlled by a dedicated machine running a dedicated firewall OS (Smoothwall).

I rely heavily on multi-monitor setups, with 5 machines using them; the two most impressive machines have 6 and 4 monitors. I have two 24 Mbit internet connections, soon to be replaced by two 100 Mbit. I also had a bunch of printers, but I recently threw most of them out as I never used them. I also have a couple of external networks connected through VPN, so that a few trusted persons have access to my stuff and I can remote-desktop them when their stuff doesn't work. My server room is air-conditioned, and my big servers are custom built to give sufficient cooling.

I work at a technical company with 50-60 employees making advanced software, yet my home network is far more complex and, imho, works better.

Sure, I'm probably somewhere above the 80th percentile or so, but this is where home users are heading. People outside Elbonia are not sitting with a single computer anymore; they have more computers and want more out of them. Each family member has their own computer, and game consoles, media centers, file servers, web servers, mail servers, home automation and so on are coming on a wide front. It's a big mistake to think of home networks as simple.

Quote:
Servers make poor bot hosts because they have more or less expert operators – and that is equally true of Linux or Windows.
Don't assume that it's a server just because it's running Linux. Linux, especially the *buntu-family, is taking giant steps forward on the desktop market now.

Quote:
We do have people working on security, standing by in case of a major incident, checking something all the time across the world. We even have drills as to what would happen if things went very bad indeed. You simply don’t get that with open source.
Nope, because when the source is open, the foundation needed for things to go really, really bad is not in place. And if it should happen, no one, not even MS, can match the army of developers available to fix it.

Quote:
Getting harder to find 32 bit systems any more.
It's harder to find 32-bit hardware, but not all systems run on hardware bought yesterday. Also, a lot of people run a 32-bit OS on 64-bit hardware, for compatibility reasons or because they happen to have a 32-bit license in a drawer somewhere and don't want to spend hundreds of dollars on a new one.

The one thing that I see that might speed up the switch to 64-bit OS's is the, in my opinion immoral, decision MS has made to limit the amount of available memory on 32-bit OS's to 4 GB (minus the address space used by the PCI host and any graphics adapters). Why do I call it immoral? Because this is not a hardware limitation; they implemented that limit purely to promote future operating systems. Hardware only limits the amount of memory available to each program to 4 GB; total memory limits are much higher. Linux, for example, can easily handle more memory on 32-bit hardware.
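The 4 GB figure falls straight out of the width of a 32-bit address; a minimal back-of-the-envelope sketch (the PAE figure assumes the original 36-bit physical addressing on 32-bit x86):

```python
# Address-space arithmetic behind the 32-bit memory limits discussed above.

def addressable_bytes(bits):
    """Bytes reachable with an address of the given width."""
    return 2 ** bits

GIB = 1024 ** 3

# A 32-bit pointer can address 4 GiB: the per-process limit.
per_process = addressable_bytes(32) // GIB

# With PAE, 32-bit x86 hardware uses 36-bit physical addresses,
# so total physical memory can go well beyond 4 GiB.
pae_total = addressable_bytes(36) // GIB

print(per_process, pae_total)  # 4 64
```

So the per-process ceiling is a hardware fact, while the total-memory ceiling on 32-bit hardware is considerably higher, which is the distinction the post is drawing.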

Quote:
No idea why there is a 4 CPU limit on consumer operating systems – it was a commercial decision rather than a technical one, I think.
Actually, it's a 2 CPU limit. I tried using Win2003 Server instead, but found that it had been crippled in other respects that makes sense for a server, but not for a workstation (mainly regarding graphics).

Quote:
XP is only 2 years younger and the number of recent security updates is around the same.
XP is getting close to an acceptable level, but it still has the dreaded mup.sys problem and is much slower, so I stick with 2000.

Quote:
23% of infections were caused by vulnerabilities, the remainder from social engineering
That's probably true, but technical solutions can, to a certain extent, protect against social engineering as well. For instance, the need to enter a password whenever doing something potentially dangerous in *buntu is a way of signalling to the user that what they are doing might be dangerous, and prompting them to ask whether it really should need that kind of privilege.

Quote:
The figures don’t back you up, I am afraid
That depends. Bugs can, more or less, be counted as a fraction of the number of lines of code; the figure usually used is 1 bug per 100 lines of code, or 1 per 200 if you are an experienced developer or the code is trivial. Of course, most of these bugs are trivial, such as the "the menus appear disconnected from the window if you make the window really, really tiny" bug that Windows suffers from, but the percentage of severe bugs is more or less constant.

If I recall correctly, Win2000 has about 8 million lines of code, XP 30 million and Vista probably a lot more. If the number of severe bugs is in linear proportion to the amount of code, we should have seen almost four times as many patches for XP as for 2000.
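The rule of thumb above is easy to turn into numbers; a minimal sketch using the 1-bug-per-100-lines figure and the line counts quoted (both of which are rough estimates, not measured values):

```python
# Rough bug-count estimate from the 1-bug-per-100-LOC rule of thumb above.

def estimated_bugs(lines_of_code, lines_per_bug=100):
    return lines_of_code // lines_per_bug

win2000_loc = 8_000_000    # ~8 million lines, figure quoted above
xp_loc = 30_000_000        # ~30 million lines, figure quoted above

print(estimated_bugs(win2000_loc))   # 80000
print(estimated_bugs(xp_loc))        # 300000
print(xp_loc / win2000_loc)          # 3.75, i.e. "almost four times"
```

The 3.75x ratio is where the "almost four times as many patches" expectation comes from; it holds for any constant bugs-per-line rate, since the rate cancels out.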
Reply With Quote
  #35  
Old 09 December 2007, 06:23 AM
ganzfeld's Avatar
ganzfeld ganzfeld is offline
 
Join Date: 05 September 2005
Location: Kyoto, Japan
Posts: 23,015
Computer

Quote:
Originally Posted by Troberg View Post
Just look at what the relatively benign viruses that struck IIS five years or so ago caused. My server (not running IIS) logged over half a million attacks from those viruses in two months. That probably meant that just about every IIS that wasn't properly patched was infected.
How could you tell whether that was two million different machines attacking you or just a few attacking over and over? You counted their IP addresses or something?
Reply With Quote
  #36  
Old 09 December 2007, 09:49 AM
BluesScale BluesScale is offline
 
Join Date: 29 December 2005
Location: Woolhampton, Berkshire, UK
Posts: 1,355
Default

Ah, Troberg, it seems that we will forever differ.

It seems that most of the areas that we are now disagreeing about are opinions rather than fact though there are a couple of things.

Side by Side came in with XP SP2 (project Springboard), and before that you had .local files. Manifests are the only .NET technology used to allow precise versioning; unmanaged code uses other mechanisms.

ActiveX controls in the browser run in restricted mode on all operating systems later than XP, which was released in 2001.

Yes, there are many developers who could fix the major versions of Linux. However, that is actually a small part of the battle. Imagine the following scenario: a remote code execution bug is found by blackhats that allows code to run as root on Linux kernel 2.4.0 and later for x86, using a port that is critical and cannot be blocked. It is used to create a botnet, well shielded by rootkits and using grid principles in a similar way to Storm. It spreads rapidly in Eastern Europe, providing a substantial base from which to infect other systems. The botnet (among other things) performs a DDoS on the main Linux sites. A large proportion of Linux systems are compromised over the course of 24 hours (and that is slow; Zlob took down Windows 2000 networks inside of 20 minutes).

OK, we have a situation. You fix the code using your amazing coding skills in a moment. You have the sources to hand, and the new builds are ready in 4 hours. Your buddies have tested them in another 4 hours. Smart work! Now, who is going to pick up the pieces: ensure that the ISPs filter the packets, handle all the customer calls from people who have been compromised or think they might have been, assist customers to upgrade, deal with perceived regression issues, and help law enforcement trace the attack, especially when your own systems have also been attacked? Would that be Red Hat? Nope, they only own one version. SCO? Not their problem. IBM? Maybe they would help some users. Who would coordinate the effort? Linus? Does he ever need to sleep?

I am sorry, but I don't see how the community could step up to the plate on this one. Sometimes you need something chunkier to solve a problem.

Blues
Reply With Quote
  #37  
Old 09 December 2007, 11:34 AM
ganzfeld's Avatar
ganzfeld ganzfeld is offline
 
Join Date: 05 September 2005
Location: Kyoto, Japan
Posts: 23,015
Computer

I have never believed that Microsoft's OSs are as safe as other OSs, nor that they are only attacked often because they're "big targets". I think they're inherently not as secure as other systems. The ratio of attacks would still be high for Microsoft if all systems were equally insecure but not as high as it is. There are also a significant number of people who would like to hit those other systems because they are considered tougher to attack than Windows. We often hear people saying that Linux and Mac OS are just as vulnerable but the evidence doesn't support the claim.

It is also, of course, silly to think that there is such a large gap in security between Microsoft and other options that those of us not running Microsoft can relax and forget about security.

Even if Microsoft doesn't have a problem making its systems more secure, they certainly have an image problem in that area. So simply saying "They're insecure too!" is, in my opinion, not a very good strategy.
Reply With Quote
  #38  
Old 09 December 2007, 12:58 PM
BluesScale BluesScale is offline
 
Join Date: 29 December 2005
Location: Woolhampton, Berkshire, UK
Posts: 1,355
Default

Quote:
Originally Posted by ganzfeld View Post
Even if Microsoft doesn't have a problem making its systems more secure, they certainly have an image problem in that area. So simply saying "They're insecure too!" is, in my opinion, not a very good strategy.
I agree on both counts. That is why we are fixing bugs as quickly as possible and improving the security of each new version. The debate was not whether MS software is secure enough; it isn't, and there may be no such thing as secure enough for all conditions. That is why money is spent to improve it. The debate was whether MS software was a disaster, and whether the solution to all security problems was to reformat all your machines and run Linux. I don't think that it is in either case.

Blues
Reply With Quote
  #39  
Old 10 December 2007, 08:22 AM
Troberg Troberg is offline
 
 
Join Date: 04 November 2005
Location: Borlänge, Sweden
Posts: 11,580
Default

Quote:
How could you tell whether that was two million different machines attacking you or just a few attacking over and over? You counted their IP addresses or something?
Actually, I didn't. Now I have (never throw out a log file...). It was 28322 distinct IPs, each of them trying a number of different approaches.
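Counting distinct attackers from a saved log is straightforward; a minimal sketch, assuming a common-log-style format where the client IP is the first whitespace-separated field on each line (the real log format and attack URLs may differ):

```python
# Count distinct source IPs in a web-server log where the IP is the
# first whitespace-separated field on each line (common log format).

def distinct_ips(lines):
    ips = set()
    for line in lines:
        parts = line.split()
        if parts:
            ips.add(parts[0])
    return len(ips)

# Illustrative entries in the style of the IIS-worm probes discussed above.
sample = [
    '10.0.0.1 - - [01/Aug/2001] "GET /scripts/root.exe HTTP/1.0" 404',
    '10.0.0.2 - - [01/Aug/2001] "GET /default.ida?NNN HTTP/1.0" 404',
    '10.0.0.1 - - [02/Aug/2001] "GET /scripts/root.exe HTTP/1.0" 404',
]
print(distinct_ips(sample))  # 3 entries, 2 distinct IPs -> prints 2
```

Using a set means repeated probes from the same host are counted once, which is exactly the distinction ganzfeld was asking about (many attacks versus many attackers).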

Quote:
Side by Side came in with XP SP2 (project Springboard) and before that, you had .local files.
OK, I missed that, probably because MS hyped it as one of "the big things" with .Net, so I assumed that it was a .Net thing.

It's not 100%, though, as I still have problems running two exe files with the same filename but different content at the same time. Windows runs the first, then assumes that the second is the same and runs the image already in memory.

Quote:
Yes, there are many developers who could fix the major versions of Linux. However, that is actually a small part of the battle. Imagine the following scenario. A remote code execution bug was found by blackhats that allowed code to run as root on Linux kernel 2.4.0 for x86 and later and using a port that it critical and can not be blocked. It is used to create a botnet well shielded by rootkits and using grid principles in a similar way to Storm. It spreads rapidly in Eastern Europe providing a substantial base to infect other systems. The botnet (among other things) performs a DDOS on the main Linux sites. A large proportion of Linux systems are compromised over the course of 24 hours (and that is slow - Zlob took down Windows 2000 networks inside of 20 minutes)

Ok, we have a situation. You fix the code using your amazing coding skills in a moment. You have the sources to hand and the new builds are ready in 4 hours. Your buddies have tested them in another 4 hours. Smart work! Now, who is going to pick up the pieces, ensure that the ISPs filter the packets, handle all the the customer calls from people who have been compromised or think that they might have been, Assist customers to upgrade and assist with perceived regression issues and assist law enforcement tracing the attack and all these other things - especially when your own systems have also been attacked. Would that be Red Hat? Nope, they only own one version. SCO? Not their problem. IBM? Maybe they would help some users. Who would coordinate the effort? Linus? Does he ever need to sleep?

I am sorry but I don't see how the community could step up to the plate on this one. Sometimes you need something chunkier to solve a problem
Well, for one thing, Linux sites are harder to target than Microsoft's site, as they are heavily mirrored and sit on extremely strong connections, often university connections. This makes it likely that the ordinary package managers handling updates would be able to handle the distribution and installation of the patch. ISPs have to solve their own problems, and are usually well equipped to do so, in terms of technology, competence and keeping up to date with recent threats. Customers get support through the same channels as usual: the various support partners they have deals with, or the internet. Coordinating the efforts? Well, given that a project as huge as Linux gets coordinated, I doubt this would be a problem. The main issue would be some redundant work being done as many people attacked the problem, but in a situation like this, that's a good thing rather than a bad thing.

As for assisting law enforcement agencies, "hackers" have done a good job of that before; in fact, several cases have been brought to justice because "hackers" more or less served the case to the police on a silver platter. Look at the German hacker whose name I can't remember right now, or the recent sabotage by MediaDefender, which was exposed by "hackers". Pissing off the elite in computer skills is an effective way of bringing the police down on yourself, much more effective than pissing off a large corporation which is perceived by many to be kicking "the little people".

Quote:
It is also, of course, silly to think that there is such a large gap in security between Microsoft and other options that those of us not running Microsoft can relax and forget about security.
Of course, one always has to use common sense. Any system will have bugs, and any system will have potential security flaws. That's unavoidable. It's a question of how many, how serious, and how they are handled. Security is not a goal; it's a journey. I think both MS and Linux understand this; I just think that Linux is a bit further down that road.

The difference is that if they were houses, with Linux, you lock your door and then you are safe, with Windows you have to put up barbed wire, machinegun nests and a moat with crocodiles.

Then again, it could be argued, from a theoretical viewpoint, that open source is bug free, since the user interface is extended to include the source code. If I don't like the way Word autoformats my stuff, I can go into the settings and change that behaviour, and it's not called a bug. When the source becomes part of the user interface, I can go into the source and change the behaviour so that it's not a bug. That's theory, though; in practice, bug-like behaviour does exist even in open source.
Reply With Quote
  #40  
Old 10 December 2007, 09:36 AM
BluesScale BluesScale is offline
 
Join Date: 29 December 2005
Location: Woolhampton, Berkshire, UK
Posts: 1,355
Default

Quote:
Originally Posted by Troberg View Post
Then again, it could be argued, from a theoretical viewpoint, that open source is bug free, since the user interface is extended to include the source code.
I had no idea that your sense of humour was so sardonic

Blues
Reply With Quote