gnumed-devel

Re: [Gnumed-devel] guideline-routed diagnostics in GNUmed


From: Karsten Hilbert
Subject: Re: [Gnumed-devel] guideline-routed diagnostics in GNUmed
Date: Thu, 21 Sep 2006 00:16:04 +0200
User-agent: Mutt/1.5.13 (2006-08-11)

On Tue, Sep 19, 2006 at 08:12:06PM +0200, Philipp Walderdorff wrote:

> In GNUmed's field "Patient Request" I can enter the patient's request 
> individually and intuitively, the way our diagnostic work actually proceeds. 
> This is what every GP does in daily work.
> 
> The other possibility could be: from here up to the field "Findings", the 
> diagnostic strategy could be guided by a guideline plugin which leads to the 
> Assessment. I do not mean an automated diagnosis.
> 
> The plugin program that could do that job would be a big project of its 
> own.
And this plugin program already exists. It is called EGADDS.

The problem with this is not the technical implementation
but rather the acquisition of *content*.

> The quickest way of data input in medicine is via keyboard, without a mouse.
> But this means: both hands on the keyboard, concentrating on typing. That 
> is OK after I have finished my diagnostic chat with the patient. The 
> patient does accept this interruption for administrative purposes.
I am typing progress notes (of medium verbosity) *daily*
with *every* patient. I am typing before and after the
consultation. I also type in some things in between. A
trained typist can easily capture the Subjective part of the
progress note while the patient is getting up from the
chair, undressing and moving over to the exam couch.

No patient has ever been upset about that. Now, that's true
for Leipzig, East Germany.

> But hammering on the keyboard while speaking to the patient, with both hands 
> on the keyboard and only a glimpse over the shoulder at the patient, isn't what 
> Balint would have done when speaking to the patient.
In my exam room the patient is sitting across the desk (I
move around to the side if I feel that's better for getting
through to the patient). I am looking at the patient *while
typing*. I can do that because it is possible to always know
the internal state of the input application.

> How can we use the mouse to move thumbnail images of the symptoms in the 
> patient's request onto an image of the human body? It is not possible in such 
> a simple way. But if we could divide and organize that process, I have the 
> idea that we could manage it.
> 
> This is, what I would like to test.  
It sounds very interesting! Please do try to implement
it so we can all assess whether it works out.

> The next question is: how can we provide guidance through the diagnostic 
> process without a rigid and inflexible sequence of questions?
> 
> How could it be that GNUmed knows which fields I need to enter now?
> Which questions have to be asked?
> 
> When we are able to enter data while speaking to the patient, in a way that 
> does not disturb the doctor-patient contact, the GNUmed program could be 
> able to think with us. GNUmed could accompany us in this diagnostic 
> process (= guideline). GNUmed should be able to offer fields to take up data 
> whenever we need them. Like a good nurse during an operation, knowing which 
> instrument the doctor will need next.
>
> A huge database would be necessary to store these diagnostic gateways.
> A self-learning program would be necessary to learn this information.
Again, the problem isn't the implementation. The problem is
to get enough useful *content* to make it work.
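
Once such content existed, the mechanics of "offering fields whenever we
need them" would be simple. Here is a minimal sketch of the idea, assuming
a hand-made rule table; the finding and field names are purely illustrative
and are not part of GNUmed:

```python
# Hypothetical rule table: which follow-up input fields to offer
# once a given finding has been entered. In a real system this
# content would come from curated guidelines, not from code.
NEXT_FIELDS = {
    'chest pain': ['pain radiation', 'dyspnea', 'ECG'],
    'cough': ['duration', 'sputum', 'fever'],
}

def suggest_fields(entered_findings):
    """Return follow-up fields for the findings entered so far."""
    suggestions = []
    for finding in entered_findings:
        for field in NEXT_FIELDS.get(finding.lower(), []):
            if field not in suggestions:
                suggestions.append(field)
    return suggestions

print(suggest_fields(['Chest pain']))
```

The code is trivial; filling NEXT_FIELDS with validated medical content
for thousands of findings is the actual project.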

> But what I would like to start with now is to test whether the first 
> approach is possible.
Sure, feel free to start hacking !

> My question is: does anyone know of Python source code that implements 
> putting graphical thumbnail objects (such as a graphical symbol for pain in 
> our example) onto some graphical surface (such as the image of a human being 
> in our project)? By positioning this object at a given location, the 
> subsequent program flow would depend on what is specified at that position. 
> That means: the object "pain" will behave differently at different locations.
> 
> I would like to play with such an initial test program to see if we can find 
> a way of condensing the many possible questions at the beginning of a 
> diagnostic chat into a few symbols.
> 
> And I would like to learn object-oriented programming with this example.

You could try to google for "Andrea Gavana" and
"ThumbnailCtrl". You need to learn about FloatCanvas and
friends.
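
The position-dependent behaviour itself can be prototyped without any GUI
toolkit at all. A minimal sketch, assuming the body image is divided into
hand-drawn rectangular hit regions (all coordinates and region names below
are made up for illustration):

```python
# Hypothetical hit regions over a body image, as axis-aligned
# rectangles in image pixel coordinates: (name, x0, y0, x1, y1).
BODY_REGIONS = [
    ('head',    80,   0, 160,  80),
    ('chest',   60,  80, 180, 200),
    ('abdomen', 60, 200, 180, 300),
]

def region_at(x, y):
    """Return the name of the body region containing point (x, y), or None."""
    for name, x0, y0, x1, y1 in BODY_REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def drop_symptom(symbol, x, y):
    """Dispatch on where a symptom thumbnail was dropped."""
    region = region_at(x, y)
    if region is None:
        return None
    return '%s in %s' % (symbol, region)

print(drop_symptom('pain', 120, 120))
```

In a wxPython prototype the drop coordinates would come from the mouse
event, and the lookup could be replaced by FloatCanvas hit-testing; the
dispatch-on-region idea stays the same.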

> Is this how the Pink Panther thinks? Unrealistic? Maybe.
> But I am a dreamer. And an optimist. :-)
When I started working on GNUmed I had no idea about SQL.

Karsten
-- 
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346



