
Re: [Gnumed-devel] GNUmed (debian) servers and security


From: Karsten Hilbert
Subject: Re: [Gnumed-devel] GNUmed (debian) servers and security
Date: Mon, 28 Jan 2008 16:32:26 +0100
User-agent: Mutt/1.5.17+20080114 (2008-01-14)

On Sun, Jan 27, 2008 at 10:32:29AM -0800, James Busser wrote:

> 1. The server needs adequate physical protection. Even if the room in
> which it resides can be accessed by thieves it would be good to have
> some additional physical lockdown of the machine. I understand that it is
> not unusual for thieves to bring boltcutters with them, therefore a special
> hardened chain that cannot be severed with bolt cutters and must instead
> be cut with a grinder may be better for this situation.
Depending on time and dedication, they might remove the hard
drives from the server, leaving the casing in place.

There are cases available which shut down the machine and
set off alarms when tampered with.

> 3. The server medical data (Postgres cluster for GNUmed, dumps,
> downloaded HL7 messages etc.) should live on an encrypted partition.
> Truecrypt seems to have become the standard for multi-OS encryption but
> its license does not qualify it for distribution in Debian proper. Is it
> still wiser / better to use it over (say) cryptmount?
Sounds reasonable.
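
For the encrypted partition itself, a minimal sketch using
LUKS/cryptsetup on Debian might look like the following (the
device /dev/sdb1 and the Debian default data directory are
assumptions, adjust for the actual machine):

    # one-time setup: put LUKS on a spare partition
    cryptsetup luksFormat /dev/sdb1
    cryptsetup luksOpen /dev/sdb1 pgdata_crypt
    mkfs.ext3 /dev/mapper/pgdata_crypt

    # after each boot, before starting PostgreSQL:
    cryptsetup luksOpen /dev/sdb1 pgdata_crypt
    mount /dev/mapper/pgdata_crypt /var/lib/postgresql
    /etc/init.d/postgresql start

Dumps and downloaded HL7 files would then need to live below
that mount point, too.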

> 4. Access to the database. Should Postgres and the machine it is sitting
> on be somehow better protected behind some other machine, or is it
> somehow acceptable for this machine to be connected directly to the
> router/internet?
I would recommend against having the database machine
sitting directly on the internet.
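
At the very least PostgreSQL should not listen on the public
interface, and pg_hba.conf should only admit the gateway. A
rough sketch (all addresses are made up for illustration):

    # postgresql.conf -- listen on loopback and the internal NIC only
    listen_addresses = 'localhost, 192.168.1.2'

    # pg_hba.conf -- password-authenticated access from the gateway only
    # TYPE  DATABASE  USER  CIDR-ADDRESS     METHOD
    host    all       all   192.168.1.1/32   md5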

> Is there anything about this set-up that needs to be
> carefully considered? It seems to me that the fact that Apache/Tomcat
> serve Oscar's MySQL data was counted as a strength, maybe because Apache's
> security has been well tested. In our case, with Postgres directly
> serving the data, are we in a less well-tested environment?
PostgreSQL is just as well tested but it is risky to have
the raw data files sitting on a machine directly connected
to the internet. An attacker could gain access by means
*other* than PostgreSQL and then connect locally or even
copy the raw files and access them elsewhere. It is Good
Practice to separate internet -> gateway access from gateway
-> database server access physically.

Hacking the gateway will not give direct access to the data.

> 5. Data integrity / safety (backups). We previously discussed the need to
> have regular (including offsite) backups. Personally I will be happier to
> see an actual backup/restore cycle tested
Well, using the scripts we provide I have dumped and
restored successfully before -- but you need to test
your procedures locally anyway.
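
A bare-bones test cycle, independent of our scripts, could
look like this (role, database and table names are
illustrative only):

    # dump the live database
    pg_dump -U gm-dbo -f gnumed_backup.sql gnumed_v8

    # restore into a scratch database and look for known data
    createdb -U postgres gnumed_restore_test
    psql -U postgres -d gnumed_restore_test -f gnumed_backup.sql
    psql -U postgres -d gnumed_restore_test \
         -c "select count(*) from clin.encounter;"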

What I do on GNUmed servers I take care of is to dump the
database once daily, tar/bzip2/notarize the dumps and then
scp them to another machine. Both the server and the other
machine include the dumps in their own regular backups.
Monthly, backups are moved to CD-ROM kept by the user. Thus
there can exist up to six copies of the data.
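
In cron-script form that amounts to something like this
sketch (paths, database name and the offsite host are
assumptions, not my actual setup):

    #!/bin/sh
    # nightly GNUmed backup: dump, compress, sign, copy offsite
    TS=`date +%Y-%m-%d`
    pg_dump -U gm-dbo gnumed_v8 > /var/backups/gnumed-$TS.sql
    tar -cjf /var/backups/gnumed-$TS.tar.bz2 /var/backups/gnumed-$TS.sql
    # "notarize": detached GPG signature over the archive
    gpg --batch --detach-sign /var/backups/gnumed-$TS.tar.bz2
    # ship archive plus signature to the other machine
    scp /var/backups/gnumed-$TS.* backup@offsite.example.org:/srv/backups/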

No encryption is done so far (apart from on-the-wire with scp).
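
Were at-rest encryption wanted, one option would be a
symmetric GPG pass over each archive before it leaves the
server (passphrase handling left aside):

    gpg --symmetric --cipher-algo AES256 gnumed-2008-01-28.tar.bz2
    # yields gnumed-2008-01-28.tar.bz2.gpg for storage and transport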

> and shown to restore data that was known to exist! For sites which did
> not want to suffer downtime, a secondary server would be a good idea. Did
> Karsten or anyone else ever establish slave/replication services on their
> database(s)?
Not that I know of. However, just for dumping the database
there is zero downtime. The replica machine would only be
needed when the primary server gets hosed somehow.

Karsten
-- 
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346



