Allocate signal - __libc_allocate_rtsig


From: Stefan Hoffmeister
Subject: Allocate signal - __libc_allocate_rtsig
Date: Mon, 23 Apr 2001 14:07:49 +0200

What is the recommended way to allocate a signal for use within a
*library*?

Hard-coding use of SIGUSR1 and SIGUSR2 inside a library is a bit dangerous,
so calling __libc_allocate_rtsig to acquire a signal sounds like a good
alternative.
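
For concreteness, the hard-coded variant looks roughly like this - whoever
installed a SIGUSR1 handler earlier (the application, or another library)
silently loses it. The handler and init function names are made up:

  #include <signal.h>

  /* Hypothetical library-internal handler. */
  static void lib_sigusr1_handler(int sig)
  {
      (void) sig;
      /* ... notify the library's worker thread ... */
  }

  int lib_init(void)
  {
      struct sigaction sa;

      sa.sa_handler = lib_sigusr1_handler;
      sa.sa_flags = 0;
      sigemptyset(&sa.sa_mask);

      /* Clobbers any handler already installed for SIGUSR1. */
      return sigaction(SIGUSR1, &sa, NULL);
  }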

Alas, __libc_allocate_rtsig is not a public - and hence not an officially
sanctioned - function.

I have found 

  http://sources.redhat.com/ml/libc-hacker/1999-02/msg00112.html

where Ulrich Drepper writes:

<quote>
On every system an application has to get by without the allocation
function [__libc_allocate_rtsig, StH]. It should only be used internally
since it will make it possible to adjust the SIGRTMIN and SIGRTMAX values.
The application will have to allocate numbers relative to these values in a
way which does not lead to conflicts.
</quote>

So, theoretically, an application could pick 

  MySignal = SIGRTMIN+1;

But AFAICS this does not help, since a library may already have acquired
SIGRTMIN+1 for its own use. In a library-versus-library situation matters get
even worse: a library can pick "something", but even if heuristics [*]
are employed, there is no guarantee (no contract) that the signal it picks
is unique.

[*] I'll ignore the fact that having to probe the signals heuristically
introduces a race.
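
For illustration, a minimal sketch of such a heuristic, assuming it probes
the current disposition with sigaction() and treats SIG_DFL as "free"; the
function name is made up, and the check-then-install gap is exactly the race
mentioned in [*]:

  #include <signal.h>

  /* Heuristic: scan the real-time range and return the first signal
   * whose disposition still looks unused (SIG_DFL).  Nothing prevents
   * another library from picking the same signal before, or after,
   * we install our own handler. */
  static int pick_free_rtsig(void)
  {
      int sig;
      struct sigaction old;

      for (sig = SIGRTMIN; sig <= SIGRTMAX; ++sig) {
          if (sigaction(sig, NULL, &old) == 0
              && old.sa_handler == SIG_DFL)
              return sig;   /* looks free -- no guarantee it stays so */
      }
      return -1;            /* no free real-time signal found */
  }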

To sum up my problem: given the current public API, how can a library or
application atomically allocate a unique signal for its own, private use?
__libc_allocate_rtsig seems to be the magic bullet, but it is not public.

TIA!
Stefan


