J-Field Suggestion

Darren Clemons

Well-Known Member
I almost commented on Darren's remarks yesterday, but decided not to bother. However, since they might show up in a publication soon I'll put in my 2 cents:

I agree with the general sentiment, but I disagree that the only thing that SMI and SurvCE give you is coordinates. There's a wealth of information about the field experience in the raw data files those applications create.

I've been using one or the other application for 27 years, and I rarely even look at the data collector coordinates in the office. Instead, I process the raw data file for use in adjustment software (Star*Net being my weapon of choice). And whenever something doesn't look right, I go straight to the raw data file to see what actually happened.

My point here isn't to criticize Darren, but rather to alert Patrick to a perceived vulnerability in a blanket statement that he's considering for publication.
Jim, I didn't intend for the "just coordinates" comment to sound as blanket as it did.
Let me expand a bit.

Of course there are mounds of data in all the raw data files we've processed from whatever software we've used over the years (29 for me), but no matter what processing, adjusting, or least squares we did with that raw data, at the end of the day we still ended up with just coordinates.

And I'm sure we all remember just how awful it was to dig through and make sense of page after page of data - especially when we started printing the raw files from TDS for our Topcon GPS jobs.

My point was simply that with the LS and its wonderful PDF reports, which are very easy to read through and very well organized, we can simply "see" so much more than we ever could before of what our field crews did.
Again, at the end of the day with the LS we still end up with just coordinates, but there's no question we can understand how those coordinates were collected and stored so much better.
 

Nate The Surveyor

Well-Known Member
I know I'm dreaming...
But wouldn't it be nice to have a little button in Carlson at each node - zoom into the node, click the button, and the PDF for that point appears... Five years from now, it's just a part of the dwg file...
Dreaming...
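For what it's worth, a crude version of the file-level plumbing can already be sketched. Below is a minimal illustration using the open-source ezdxf Python library, with hypothetical file names; note it writes DXF rather than DWG, and whether Carlson would surface the link as a clickable button is a separate question:

```python
# Sketch of the idea above: attach a per-point hyperlink to its PDF
# report inside a DXF drawing. File names here are hypothetical.
import ezdxf

doc = ezdxf.new()
msp = doc.modelspace()

# One point entity per surveyed node, each carrying a link to its report.
node = msp.add_point((2048.15, 1793.42))
node.set_hyperlink("reports/point_101.pdf",
                   description="J-Field report for point 101")

doc.saveas("survey_with_reports.dxf")
```

CAD packages that honor DXF hyperlinks can then jump from the node straight to the report, which is most of the "little button" described above.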
 

Patrick Garner

Active Member
Jim, why Star*Net instead of using Cluster Points? If I had taken 3, 4 or 5 shots on the same corner, then used cluster averaging, I'd be leery about shifting the location afterward based on least squares. Or am I missing something?
 

Jim Frame

Well-Known Member
Each RTK position stored has a unique error estimate associated with it. These are reflected in the cross-correlations or covariance matrices of the g-file. Using these individual error estimates in an adjustment ensures that each position is weighted appropriately. Cluster averaging, as I understand it, assumes that each stored position is of equal quality, so a strong position can be contaminated by a weak solution in an arithmetic mean. But if I'm mistaken about this, I welcome corrections!
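To make that distinction concrete, here's a small illustrative sketch in Python (invented numbers, not J-Field or Star*Net output) comparing a plain arithmetic mean with an inverse-variance weighted mean of repeated shots on one corner:

```python
# Illustrative only: invented northings and per-shot standard errors.
northings = [5000.012, 5000.006, 5000.058]  # meters; third shot is weak
sigmas    = [0.008,    0.010,    0.045]     # meters

# Arithmetic mean: the weak 0.045 m shot drags the result noticeably.
arith = sum(northings) / len(northings)     # ~5000.0253

# Inverse-variance weighting: each shot counts by 1/sigma^2, so the
# weak shot contributes very little to the final position.
weights  = [1.0 / s**2 for s in sigmas]
weighted = sum(w * n for w, n in zip(weights, northings)) / sum(weights)
print(f"{arith:.4f}  {weighted:.4f}")       # 5000.0253  5000.0106
```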
 

Adam

Well-Known Member
5PLS
Each RTK position stored has a unique error estimate associated with it. These are reflected in the cross-correlations or covariance matrices of the g-file. Using these individual error estimates in an adjustment ensures that each position is weighted appropriately. Cluster averaging, as I understand it, assumes that each stored position is of equal quality, so a strong position can be contaminated by a weak solution in an arithmetic mean. But if I'm mistaken about this, I welcome corrections!
Cluster averaging is a weighted average.
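For reference, assuming the weights are the usual inverse variances (the exact scheme J-Field uses isn't spelled out here), the cluster average of shots $x_i$ with standard errors $\sigma_i$, and its formal variance, would be:

$$\hat{x} = \frac{\sum_i x_i/\sigma_i^2}{\sum_i 1/\sigma_i^2}, \qquad \sigma_{\hat{x}}^2 = \frac{1}{\sum_i 1/\sigma_i^2}$$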
 

Jim Frame

Well-Known Member
Cluster averaging is a weighted average.

Thanks for that clarification. That makes the g-file route less about the weighting itself and more a convenient way to get that weighting into a least-squares adjustment. Since almost all of my RTK projects also involve terrestrial and/or static observations, it's still my preferred approach.

It also aids in assessing the accuracy of the assigned RTK weights. In general, I find that the J-Field error estimates have to be inflated by around 50% to jibe with terrestrial and static measurements.
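As a quick illustration of what that inflation does to the weighting (hypothetical numbers), scaling a standard error by 1.5 cuts the corresponding weight by 1.5² = 2.25:

```python
# Hypothetical numbers: inflate a reported J-Field error estimate by 50%
# before using it as an a priori standard error in an adjustment.
sigma_reported = 0.012                 # meters, as reported
sigma_adjusted = sigma_reported * 1.5  # 0.018 m

# Weights go as 1/sigma^2, so a 1.5x sigma factor means ~2.25x less weight.
ratio = (sigma_reported / sigma_adjusted) ** 2
print(round(sigma_adjusted, 3), round(ratio, 3))  # 0.018 0.444
```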
 

Phillip Lancaster

Active Member
When I was using Carlson data collection, if I imported my points via the .crd instead of a .txt, it would give me the "info" on each GPS shot. I guess it's in the .crd or the raw file. It would tell me the sats used and the H&Z of each point. Is this what you're talking about, Nate - getting that information into Carlson with the LS data? If not, it would be cool if Javad could get that to work.
 

Phillip Lancaster

Active Member
[Attachment: example.jpg]
 

Shawn Billings

Shawn Billings
5PLS
Thanks for that clarification. That makes the g-file route less about the weighting itself and more a convenient way to get that weighting into a least-squares adjustment. Since almost all of my RTK projects also involve terrestrial and/or static observations, it's still my preferred approach.

It also aids in assessing the accuracy of the assigned RTK weights. In general, I find that the J-Field error estimates have to be inflated by around 50% to jibe with terrestrial and static measurements.

When mixing terrestrial traverse and GNSS vectors, Least Squares is the only way to fly. I used Columbus for a long time, and now I use Carlson on the rare occasion that I need to combine total station observations and GNSS vectors. I still think Columbus is the best, but it had a steep learning curve that I had to climb all over again every time I used it. Carlson is a bit simpler, if not quite as exhaustive as Columbus.

This thread describes a recent project with cluster averaging and Relative Accuracy reporting all done within the Triumph-LS - https://support.javad.com/index.php?threads/triumph-ls-cluster-average-and-relative-accuracy.2619/
 

party chief3

New Member
Jim, I've used all three, and it applies. I loved how SMI never really forgot anything you did - like translating, rotating, or correcting HRs in the raw file for re-processing in the field.
 