
Re: [gpsd-dev] [gpsd] Altitude in TPV


From: Gerry Creager - NOAA Affiliate
Subject: Re: [gpsd-dev] [gpsd] Altitude in TPV
Date: Fri, 2 Nov 2018 15:50:04 -0500

Yo! Indeed!
On Mon, Oct 29, 2018 at 5:17 PM Gary E. Miller <address@hidden> wrote:
Yo Gerry!

On Mon, 29 Oct 2018 16:50:00 -0500
Gerry Creager - NOAA Affiliate <address@hidden> wrote:

> > The u-blox NEO-M8T reports pseudorange, carrier phase and doppler for
> > GPS, SBAS and GALILEO on the L1 band.  I can't get the M8T to output
> > anything for GLONASS.  That should be enough for RINEX 3.
> > 
>
> Is pseudorange rate in there, as well? If we've got range and phase,
> we can approximate rate, and that'd give us a good shot at solid 3D
> postprocessing...

I'm unfamiliar with 'pseudorange rate'.  Is that the same as doppler?
RINEX just wants pseudorange, carrier phase and doppler.  Doppler is the
speed at which each satellite is approaching or receding.

PRR is indeed doppler, expressed in m/sec. In the world I grew up in, doppler was a frequency shift, expressed in Hz... Yes, I'm old.
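For the record, a minimal sketch of the unit conversion being discussed, assuming the usual sign convention that a positive Doppler shift means the satellite is approaching (the carrier frequency here is GPS L1; this is illustrative, not gpsd code):

    # Convert a Doppler shift (Hz) to a pseudorange rate (m/s).
    # Assumption: positive Doppler = satellite approaching = range decreasing.
    C = 299_792_458.0          # speed of light, m/s
    F_L1 = 1_575_420_000.0     # GPS L1 carrier frequency, Hz

    def doppler_to_range_rate(doppler_hz, carrier_hz=F_L1):
        """Pseudorange rate in m/s from a Doppler observation in Hz."""
        wavelength = C / carrier_hz          # ~0.1903 m for L1
        return -wavelength * doppler_hz      # approaching -> negative range rate

    # Example: a +1000 Hz Doppler shift is roughly -190 m/s of range rate.
    print(doppler_to_range_rate(1000.0))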

> Single-frequency data didn't provide sufficient information in carrier
> phase to resolve either horizontal or vertical, save for RTK, and RTK
> never quite got vertical right, although it was better than autonomous
> code-phase L1 only. L5 was added to allow a civil code and carrier
> capability. L1C/L2C has mitigated that to a great extent. A lot of
> people have tried to make survey-quality results work with L1/code
> only, and have reported error figures that were not supported when
> their studies were repeated objectively. You can get close, but it's
> not geodesy at that point.

I'm looking forward to the u-blox 9 series with its multiple frequency
support.  That is why I'm trying to update gpsd for what that needs.

Cool! Looking forward to that!
 
> > https://www.ngs.noaa.gov/GRD/GPS/DOC/pages/pages.html
> > FORTRAN 77.  Wow.  Serious stuff.
> Easy enough even a scientist can learn it.

Something to look at after RINEX support for gpsd.

> > Interesting.  Worth some testing.  Doing 4 hour data captures is a
> > lot easier than 24 hour runs.
> It's actually in the makeup of the constellation. The geoid and
> ellipsoid are straightforward. There's plenty of computational
> horsepower and memory for those datasets and calculations in most
> receivers. I can't speak to the dongles, but even when I started
> playing with "low cost" board level GPS, the processors were
> hard-core and had lots of memory for the time.

Since we can't presently change the GPS receiver firmware, it would
have to be done in gpsd, or a gpsd client.  Easy to do, if gpsd
has the needed data and algorithms.
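As a concrete illustration of "easy to do, given the data": orthometric (MSL) altitude is just ellipsoidal height minus the geoid undulation, H = h - N. A minimal sketch, with a made-up undulation value standing in for a real geoid-model lookup:

    # Orthometric (MSL) altitude from ellipsoidal (HAE) altitude:
    #   H = h - N, where N is the geoid undulation at the position.
    # The undulation below is a made-up constant for illustration; a real
    # client would interpolate it from a geoid model such as EGM2008.

    def orthometric_height(ellipsoidal_m, geoid_undulation_m):
        """Height above mean sea level from height above the ellipsoid."""
        return ellipsoidal_m - geoid_undulation_m

    # Hypothetical example: HAE of 250.0 m where the geoid sits 28.5 m
    # above the ellipsoid gives an MSL altitude of about 221.5 m.
    print(orthometric_height(250.0, 28.5))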

It's probably marginally do-able in gpsd, but is it worth it? PAGES, for example, is standalone, reads RINEX, and computes the geodetic results as an application. Is it useful to overburden gpsd to add that functionality?
 
> I'll have to find the paper I wrote. Gotta have it somewhere. It
> was  enough to get NGC to create NGS-58 at the time.

Wow, 2 cm accuracy 95% of the time, in 1997.

With a lot of observation time, resolving multiple baselines, and doing least-squares adjustments. But, yes, it was do-able.
 
> > Wow, if it takes days of measurements to average, then the post
> > processing of short data sets looks a lot better.
> Over time, sure. That said, 10 or so hours decimated will give you a
> good first guess, and you can augment it with another dataset, or 3,
> later improving the estimate. Just remember that your error is a
> least-squares error instead of a simple standard deviation.
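One simple way to fold a later dataset into an earlier estimate, offered only as an illustration rather than anything the thread prescribes, is inverse-variance weighting of the independent estimates:

    # Combine independent estimates by inverse-variance weighting.
    # Each estimate is (value, variance); the combined variance shrinks
    # as more datasets are added.  Illustrative only.

    def combine_estimates(estimates):
        """Inverse-variance weighted mean of (value, variance) pairs."""
        weights = [1.0 / var for _, var in estimates]
        total_w = sum(weights)
        value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total_w
        return value, 1.0 / total_w

    # Hypothetical example: two height estimates, 221.4 m (variance 0.09)
    # and 221.9 m (variance 0.25), combine to about 221.5 m.
    print(combine_estimates([(221.4, 0.09), (221.9, 0.25)]))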

That is why gpsprof and ntpviz compute skewness, kurtosis, and much
more: to clearly show that the data is not even close to a normal
distribution.  That assumption has been misleading people for decades.

Any idea how long I've been preaching that mantra? My first paper on horizontal accuracy, and how to achieve it, computed skewness and kurtosis, and the editors ripped the hard-core statistics out as too much information/detail.
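For anyone following along, a minimal sketch of that check using the sample moments (an illustration only, not gpsprof's or ntpviz's actual code):

    def skewness_and_kurtosis(samples):
        """Sample skewness and excess kurtosis; near (0, 0) for normal data."""
        n = len(samples)
        mean = sum(samples) / n
        m2 = sum((x - mean) ** 2 for x in samples) / n
        m3 = sum((x - mean) ** 3 for x in samples) / n
        m4 = sum((x - mean) ** 4 for x in samples) / n
        skew = m3 / m2 ** 1.5
        excess_kurtosis = m4 / m2 ** 2 - 3.0
        return skew, excess_kurtosis

    # Hypothetical altitude residuals, in meters; heavy tails and asymmetry
    # show up as skew far from 0 and excess kurtosis well above 0.
    residuals = [-0.4, -0.1, 0.0, 0.1, 0.2, 0.3, 1.8, 2.4]
    print(skewness_and_kurtosis(residuals))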
 
> > I guess too much to ask for that code to be put in gpsd?
> Depends on how we want to go about it. The decimation approach is a
> bit memory expensive but not too bad. Or write it to ramdisk or a
> temp file and process from there in a 2-step fashion.

Storage is cheap now.  I just looked at a 512 GB SSD for $80.  Run time
is not an issue since the raw data collection is already many hours long.

Sorta how I'm thinking. The best first-pass approach is probably fast-epoch data collection, judicious decimation, and then averaging. Two strategies for the averaging: average the whole decimated dataset, or process it twice, removing from the second and final average any observation where a coordinate is more than 2 SD out from the mean.
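A minimal sketch of that two-pass variant, assuming the epochs are already parsed into (lat, lon, alt) tuples; none of this is gpsd code, just the shape of the idea:

    import statistics

    def decimate(epochs, every=10):
        """Keep every Nth fix from a fast-epoch collection run."""
        return epochs[::every]

    def two_pass_average(fixes, cutoff_sd=2.0):
        """Average, then re-average after dropping fixes where any
        coordinate is more than cutoff_sd standard deviations from the mean."""
        cols = list(zip(*fixes))                      # per-coordinate columns
        means = [statistics.mean(c) for c in cols]
        sds = [statistics.stdev(c) for c in cols]
        kept = [f for f in fixes
                if all(abs(v - m) <= cutoff_sd * sd
                       for v, m, sd in zip(f, means, sds))]
        return tuple(statistics.mean(c) for c in zip(*kept))

    # Hypothetical use: fixes = decimate(parsed_tpv_epochs, every=10)
    #                   lat, lon, alt = two_pass_average(fixes)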
--
Gerry Creager
NSSL/CIMMS
405.325.6371
++++++++++++++++++++++
“Big whorls have little whorls,
That feed on their velocity; 
And little whorls have lesser whorls, 
And so on to viscosity.” 
Lewis Fry Richardson (1881-1953)
