gnumed-devel

[Gnumed-devel] guideline-routed diagnostics in GNUmed


From: Philipp Walderdorff
Subject: [Gnumed-devel] guideline-routed diagnostics in GNUmed
Date: Tue, 19 Sep 2006 20:12:06 +0200
User-agent: KMail/1.8.2

In GNUmed's "Patient Request" field I can enter the patient's request 
individually and intuitively, the way our diagnostic work actually proceeds. 
This is what every GP does in daily work.

The other possibility would be: from here up to the "Findings" field, the 
diagnostic strategy could be guided by a guideline plugin which leads on to 
the Assessment. I do not mean an automatic diagnosis.

A plugin program that could do that job would be a big project of its own.

Because doctors have various ways of thinking, and because diagnostics in 
general medicine is complicated (a never-ending process of discussion), an 
adequate application would be a very big project.

Nevertheless, I have been dreaming for years of getting it realized.

A guideline program for that purpose needs a data input mode which does not 
disturb the doctor/patient contact.

The quickest way of entering data in medicine is via the keyboard, without 
the mouse. But this means both hands on the keyboard, concentrating on 
typing. That is fine after I have finished my diagnostic chat with the 
patient; the patient accepts this interruption for administrative purposes.

But hammering away at the keyboard while speaking to the patient, with both 
hands on the keys and only a glance over the shoulder at the patient, isn't 
what Balint would have done during a consultation.

So I think that, for that part of the doctor-patient contact, the better way 
of entering data from the diagnostic chat into GNUmed would be to use only 
(or predominantly) the mouse.

Do we need to enter data while doing the diagnostic/therapeutic chat?

If we want our diagnostic chat to be guided by a program, we have to enter 
data right in front of the patient. This has to be done in a way which 
disturbs the chat not at all, or only very little.

How can we use the mouse to move thumbnail images of the symptoms in the 
patient's request onto an image of the human body? It is not possible in 
quite such a simple way, but if we could divide and organize that process, I 
have the idea that we could manage it.

This is what I would like to test.

The next question is: how can we provide guidance through the diagnostic 
process without a rigid and inflexible sequence of questions?

How could GNUmed know which fields I need to enter right now? Which questions 
have to be asked?

Once we are able to enter data while speaking to the patient, in a way that 
does not disturb the doctor-patient contact, the GNUmed program could begin 
to think with us. GNUmed could accompany us through this diagnostic process 
(= the guideline). GNUmed should be able to offer fields to take up data 
whenever we need them, like a good nurse during an operation who knows which 
instrument the doctor will need next.
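To make this concrete for myself, here is a tiny toy sketch of what I mean. 
It is not a GNUmed design; the rules, field names and symptom names are 
invented placeholders. The point is only that the next fields to offer are 
derived from whatever has already been recorded, instead of following a fixed 
sequence of questions:

# Toy illustration only (not GNUmed code): suggest the next entry fields
# from whatever has been recorded so far, rather than a fixed question list.
# All rule contents below are invented placeholders.

RULES = [
    # (condition over the recorded data, fields to offer next)
    (lambda d: "pain" in d and d.get("location") == "abdomen",
     ["relation to meals", "nausea", "last bowel movement"]),
    (lambda d: "pain" in d and d.get("location") == "head",
     ["duration", "aura", "visual symptoms"]),
    (lambda d: "pain" in d and "location" not in d,
     ["location"]),
]

def suggest_fields(recorded):
    """Return the fields the program could offer, given what was entered so far."""
    for condition, fields in RULES:
        if condition(recorded):
            return fields
    return []

print(suggest_fields({"pain": True}))                        # -> ['location']
print(suggest_fields({"pain": True, "location": "abdomen"})) # -> ['relation to meals', ...]

In a real system these rules would of course come from a guideline database, 
not from a hard-coded list.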

A huge database would be necessary to store these diagnostic pathways, and a 
self-learning program would be necessary to learn this information.

Certainly this is a future project.

But what I would like to start with now is to test whether the first approach 
is possible.

As I am new to Python and object-oriented programming, I am not able to start 
this project on my own. Not now. But maybe later :-)

My question is: does anyone know of Python source code that places graphical 
thumbnail objects (for example, the graphical symbol for pain) onto a 
graphical surface (in our project, an image of a human body)? When such an 
object is dropped at a given position, the subsequent program flow should 
depend on what is specified for that position. That means the object "pain" 
would behave differently at different locations.
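To show the kind of minimal, self-contained test I have in mind, here is a 
sketch. It uses the standard library's tkinter canvas rather than GNUmed's 
wxPython toolkit, simple rectangles stand in for the body image, a red circle 
stands in for the "pain" thumbnail, and the region names and follow-up texts 
are made up:

# Minimal, self-contained sketch (not GNUmed code): drag a "symptom" symbol
# onto a stand-in for a body image; the drop position selects the follow-up.
import tkinter as tk

# Hypothetical body regions: name -> (x1, y1, x2, y2) on the canvas.
REGIONS = {
    "head":    (150, 20, 250, 100),
    "abdomen": (150, 180, 250, 280),
    "leg":     (150, 280, 250, 420),
}

# What the "pain" object should trigger in each region (placeholder text).
FOLLOW_UP = {
    "head":    "ask about headache character, duration, aura ...",
    "abdomen": "ask about relation to meals, stool, micturition ...",
    "leg":     "ask about exertion, swelling, trauma ...",
}

class Demo:
    def __init__(self, root):
        self.canvas = tk.Canvas(root, width=400, height=450, bg="white")
        self.canvas.pack()
        # Draw the body regions as simple rectangles standing in for the image.
        for name, (x1, y1, x2, y2) in REGIONS.items():
            self.canvas.create_rectangle(x1, y1, x2, y2, outline="grey")
            self.canvas.create_text((x1 + x2) / 2, y1 + 10, text=name, fill="grey")
        # The draggable "pain" symbol (a red circle instead of a thumbnail image).
        self.pain = self.canvas.create_oval(20, 20, 50, 50, fill="red", tags="pain")
        self.canvas.tag_bind("pain", "<B1-Motion>", self.on_drag)
        self.canvas.tag_bind("pain", "<ButtonRelease-1>", self.on_drop)

    def on_drag(self, event):
        # Move the symbol so it follows the mouse while the button is held.
        self.canvas.coords(self.pain, event.x - 15, event.y - 15,
                           event.x + 15, event.y + 15)

    def on_drop(self, event):
        # Decide the follow-up from the region the symbol was dropped on.
        for name, (x1, y1, x2, y2) in REGIONS.items():
            if x1 <= event.x <= x2 and y1 <= event.y <= y2:
                print("pain dropped on", name, "->", FOLLOW_UP[name])
                return
        print("pain dropped outside any defined region")

root = tk.Tk()
root.title("drop-position demo")
Demo(root)
root.mainloop()

The important part is the drop handler: the same "pain" object triggers a 
different follow-up depending on where it lands. With a real body image the 
rectangle lookup would be replaced by a proper hit map, but the program flow 
would stay the same.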

I would like to play with such an initial test program to see whether we can 
find a way of condensing the many possible questions at the beginning of a 
diagnostic chat into a few symbols.

And I would like to learn object-oriented programming with this example.

Is this the way the Pink Panther thinks? Unrealistic? Maybe.
But I am a dreamer. And an optimist. :-)

Philipp Walderdorff
