Wednesday, May 21, 2008

How to allow Internet access for research and protect servers at the same time

Q: How do we allow Internet access for research and protect our servers at the same time? As a programmer analyst I use the Internet to do research for my work. We are looking for a way to still allow this but protect our servers where client info is held. We don't want to be constantly updating a trusted sites list with website addresses that our programmers use. Any suggestions would be greatly appreciated!

A: Hi ....,
As recommended in previous posts, the first step is to adopt a comprehensive security framework (NIST, COBIT, ISO 27001, etc.) so that all high-level security concerns are addressed with appropriate security controls and information security governance is in place.

Once policies and procedures are in place, I have several recommendations on the architecture side:
1- Segment your servers from the research desktops. Servers should sit behind security controls such as firewalls, intrusion prevention, data leakage protection, database security, and behavioral anomaly detection. (A default-deny zoning sketch appears after this list.)
2- Secure your desktops and servers with client-based firewalls, host-based intrusion prevention, 802.1X, anti-virus, anti-malware, encryption, group policies, file integrity monitoring (e.g., Tripwire), application control, secure IM, etc. Always protect your client data: on the servers that hold it, use rights management, database security, encryption, and audit tools, and make sure every security policy is enforced. Strong authentication is also recommended. Control every output channel as well, such as USB tokens, CD writers, and backup tapes.
3- Filter your Internet HTTP/HTTPS access with anti-malware, anti-virus, and instant-messaging protection in addition to URL filtering. You can use a hosted filtering service such as ScanSafe, or go with an in-house solution (Blue Coat, Websense, Secure Computing, etc.) that combines URL, AV, and IM filtering. Deny outbound tunnels on external gateways, since tunneling is a very common way for programmers to bypass filters (see the egress sketch below).
4- And for your programmers: managing white-lists (trusted sites) by hand is painful. Put a self-service management gateway in front of your URL filtering so that users can request access to the sites they need for a limited time and the approval process is automated. It works like this: a web portal lets end users request new trusted sites for research; approval can be automatic or manual, time-limited or unlimited, per user or for everyone. Integrate the portal with your internal user directory (Active Directory or LDAP) so it recognizes each requester without an additional login. (A sketch of this workflow follows the list.)
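
To make point 1 concrete, here is a minimal Python sketch of default-deny zoning between research desktops and the server segment. The zone names, ports, and flows are illustrative assumptions only, not a real policy; in practice this logic lives in your firewall rule base.

```python
# Minimal sketch of a zone-based segmentation policy, assuming three
# hypothetical zones: "research" desktops, the "internet", and the
# "servers" segment that holds client data. Flows and ports are examples.

ALLOWED_FLOWS = {
    # (source zone, destination zone): allowed destination ports
    ("research", "internet"): {80, 443},   # web research goes out via the proxy
    ("research", "servers"): {443},        # app front end only, no direct DB access
    ("servers", "servers"): {1433, 1521},  # database traffic stays inside the segment
}

def is_allowed(src_zone: str, dst_zone: str, dst_port: int) -> bool:
    """Default-deny: a flow is permitted only if it is explicitly listed."""
    return dst_port in ALLOWED_FLOWS.get((src_zone, dst_zone), set())

if __name__ == "__main__":
    print(is_allowed("research", "servers", 443))   # True: front end is reachable
    print(is_allowed("research", "servers", 1433))  # False: no direct DB access
```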
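
For point 3, the sketch below illustrates one way an egress gateway might refuse outbound tunnels: the proxy only honors CONNECT to port 443 on sites your URL filter can categorize. The categorize() lookup is a hypothetical stand-in for whatever filtering service you actually deploy.

```python
# Minimal sketch of an egress rule that denies outbound tunnels, assuming
# all traffic is forced through a filtering proxy at the gateway.

BLOCKED_CATEGORIES = {"anonymizer", "proxy-avoidance", "uncategorized"}

def categorize(host: str) -> str:
    """Placeholder for the URL-filtering service lookup."""
    known = {"msdn.microsoft.com": "technology"}
    return known.get(host, "uncategorized")

def allow_connect(host: str, port: int) -> bool:
    """CONNECT (the tunneling verb) is honored only to 443 on categorized,
    non-anonymizer sites; everything else is denied at the gateway."""
    if port != 443:
        return False  # no tunnels to arbitrary ports (SSH, custom proxies, etc.)
    return categorize(host) not in BLOCKED_CATEGORIES

if __name__ == "__main__":
    print(allow_connect("msdn.microsoft.com", 443))  # True
    print(allow_connect("evil-tunnel.example", 22))  # False: blocked tunnel
```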
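
And for point 4, here is a rough sketch of the self-service whitelist workflow. Every name in it (request_access, APPROVAL_TTL, the in-memory store) is hypothetical; a real portal would persist requests and recognize the requester through Active Directory/LDAP single sign-on rather than take a username as a parameter.

```python
# Minimal sketch of a self-service trusted-sites workflow: low-risk
# categories are auto-approved for a limited time, the rest are queued
# for manual review by an administrator.

from datetime import datetime, timedelta

APPROVAL_TTL = timedelta(days=7)  # access is time-limited, not permanent
AUTO_APPROVE_CATEGORIES = {"technology", "reference"}

whitelist = {}  # url -> (requesting user, expiry time)

def request_access(user: str, url: str, category: str) -> str:
    """Auto-approve low-risk categories; everything else waits for review."""
    if category in AUTO_APPROVE_CATEGORIES:
        whitelist[url] = (user, datetime.now() + APPROVAL_TTL)
        return "approved"
    return "pending manual review"

def is_whitelisted(url: str) -> bool:
    """The URL filter consults this before letting a request through."""
    entry = whitelist.get(url)
    return entry is not None and entry[1] > datetime.now()

if __name__ == "__main__":
    print(request_access("jdoe", "msdn.microsoft.com", "technology"))  # approved
    print(is_whitelisted("msdn.microsoft.com"))  # True until the TTL expires
```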

Let me know if you have any specific question about any of the solution areas I have briefly mentioned.
cheers,
- yinal

