
Blackbird
Built for Speed
Premium
join:2005-01-14
Fort Wayne, IN
kudos:3
Reviews:
·Frontier Communi..

2 recommendations

reply to daveinpoway

Re: Think layers of security is all that? Think again

I consider the human to be the most important security layer for any system. A user's customary practices set a number of important, even "absolute", barriers to entire classes of infectors. Granted, just as with anti-malware programs, there are differences among humans and their security practices... some succeed better than others against the universe of malware. For example, if I never installed or used Flash, all Flash exploits would be excluded from the cloud of malware that could potentially infect my system. Likewise, if I kept JavaScript disabled at all times in my browsing, all JavaScript-related infectors would be excluded. On the other hand, if I disabled JavaScript for many sites, but not all, then I would be slightly opening the door to potential JavaScript-related infectors.

I am convinced that the human user remains the first line of defence against malware... where he goes, what he does or opens, how he does it, which settings he enables or disables, all the choices he makes - all play extremely strong roles in keeping (or not keeping) a system free of malware. All of which makes the knowledgeable human user a very powerful component of layered security. Yet the article, like much of the discussion on this subject, ignores the human as such a major layer, perhaps the strongest one.

At the end of the day, there will always be theoretical exploits that could penetrate any system and any set of user practices. But the concept of layered security reduces the overall probability of infection by compounding the low individual probabilities of an infection passing through each layer (including the human) into an extremely low aggregate probability.
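To put rough numbers on that compounding effect, here is a minimal sketch in Python. The per-layer figures are invented purely for illustration, and it assumes the layers fail independently of one another:

```python
# Toy illustration of compounding layers (made-up bypass probabilities).
# Each value is the chance a given piece of malware gets past that layer.
layers = {
    "human user (safe habits)": 0.05,
    "whitelisting":             0.02,
    "exploit mitigation":       0.10,
    "real-time AV":             0.30,
}

aggregate = 1.0
for name, p_bypass in layers.items():
    aggregate *= p_bypass

print(f"Aggregate probability of slipping past every layer: {aggregate:.6f}")
# With these invented numbers: 0.05 * 0.02 * 0.10 * 0.30 = 0.000030
```

The point is only that multiplying several small probabilities yields a far smaller aggregate one; correlated failures (say, a careless click that also disables a layer) would weaken the effect.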
--
“The American Republic will endure until the day Congress discovers that it can bribe the public with the public's money.” A. de Tocqueville



StuartMW
Who Is John Galt?
Premium
join:2000-08-06
Galt's Gulch
kudos:2

I think many, even those in security fields, simply fail to ask simple questions.

The purpose of malware, any malware, is for one human (or group) to gain what another has, be that data, money, access, etc.

Thus the first two questions should be "What?" and "Why?" Once those have been answered, you then ask "How?"

The answers to those questions are often different for different individuals and organizations.

PS: Imagine I set up a network with PCs, firewalls, A/V and so on, but no one (i.e. no people) uses it. Who would bother to hack into that system? What purpose would it serve to do so?
--
Don't feed trolls--it only makes them grow!



Woody79_00
I run Linux am I still a PC?
Premium
join:2004-07-08
united state

1 recommendation

reply to Blackbird

Very good points, Blackbird! The human behind the controls is the number one determining factor.

Just some food for thought, but I think "The Law of Diminishing Marginal Returns" is something that is overlooked by the layered security approach and its advocates.

I think it's plausible that at some point we need to ask ourselves, "At what point does adding additional security layers actually begin to generate negative returns?"

As with all things, CPU cycles, power, etc. are not free. So I do think how many and which layers a person uses should be taken into consideration, along with the complexity each layer adds to the system. Past exploits in security products themselves, such as those found in the Trend Micro, McAfee, and Symantec AV engines, should also be considered.

I personally have moved to the following security model:

1. Whitelisting via Software Restriction Policies --- Administrator approval (a password only I know) is required to run any executable outside of the Windows or Program Files directories that I haven't explicitly whitelisted... simply put, they won't run (see the sketch after this list for the basic default-deny logic). It has virtually no overhead, takes about 20 minutes to set up, is easy to learn, and stops most potential problems.

2. EMET --- I use the Enhanced Mitigation Experience Toolkit to force all my programs to run with DEP, ASLR, SEHOP, and other such program-hardening rules. It requires no real overhead, isn't too hard to set up, and just works.

3. I run one real-time security product --- in this case Vipre, because it was cheap. It works, has a built-in firewall, and does its job, which is not really much considering I practice safe hex and nothing seems to get past layers 1 or 2... especially since all non-whitelisted executables require admin approval with a password to even execute.

4. I scan with Malwarebytes once a week. It never finds anything.
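To make the whitelisting idea in item 1 concrete, here is a minimal sketch of the default-deny decision in Python. This is only a toy model of the policy logic, not how Software Restriction Policies are actually configured (that is done through the Local Security Policy / Group Policy editors); the trusted directories and example paths below are assumptions for illustration.

```python
from pathlib import PureWindowsPath

# Toy model of a default-deny (whitelist) execution policy:
# executables run only from trusted directories or an explicit whitelist;
# everything else is blocked unless an administrator approves it.
TRUSTED_DIRS = [
    PureWindowsPath(r"C:\Windows"),
    PureWindowsPath(r"C:\Program Files"),
    PureWindowsPath(r"C:\Program Files (x86)"),
]
EXPLICIT_WHITELIST = {
    PureWindowsPath(r"C:\Tools\putty.exe"),   # hypothetical example entry
}

def is_allowed(exe: str) -> bool:
    """Return True if the executable would run without admin approval."""
    path = PureWindowsPath(exe)
    if path in EXPLICIT_WHITELIST:
        return True
    return any(trusted in path.parents for trusted in TRUSTED_DIRS)

# Example decisions (paths are made up):
for exe in (r"C:\Windows\System32\notepad.exe",
            r"C:\Users\Me\Downloads\totally_legit.exe"):
    verdict = "run" if is_allowed(exe) else "blocked (admin password required)"
    print(f"{exe}: {verdict}")
```

The real policy does the equivalent at the OS level, which is why anything dropped into a temp or download folder simply never executes without a deliberate decision by the administrator.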

In terms of the law of diminishing marginal returns, I feel this is the best setup for "me". Adding any other layers would not yield enough of a security benefit to be worth the cost in time and hardware; the return on the investment would dwindle too far toward the negative side of the scale for my liking... not enough benefit for the resources expended, both human and hardware.

Again, everyone's needs and system requirements are different. I doubt any two people will end up with the same setup if they think out and design their own plans.

I do think it is prudent to protect yourself in the best and most efficient manner possible. That is going to be different for everyone, of course. However, I also believe that layering can be overdone, and that using too much results in too little.

I think this kind of efficiency is a conversation worth having among professionals like all of us fine folks who frequent these forums. I also think whitelisting is something more home users should learn about and take advantage of.