Finding Dupes #1
@pedromorgan hi Pete, yes, you are correct: the timestamp is the 2nd-last CSV field. So to break the current CSV raw logs back into each json fetch, you can just switch on the timestamp. Normally, each json block will be some 5.5 seconds after the last, except if there has been a gap in the feed. And of course each CSV line commences with the UNIQUE flight ID assigned by crossfeed...

And the static html pages, like http://htmlpreview.github.io/?https://github.com/fgx/crossfeed-dailies/blob/gh-pages/html/20160806.htm, show how I broke the logs into Models and Callsigns... Of course the callsign can be a crazy string.

And Theo has produced a great 3D global view of each CSV file here - http://fgx.github.io/sandbox/globe-crossfeed-replay/

Hope this answers your question here? There are no duplicated records...
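The splitting rule described above (a new json fetch roughly every 5.5 seconds, detected by a jump in the 2nd-last CSV field) could be sketched as follows. The field layout and the gap threshold are assumptions taken from this comment, not from the actual crossfeed log-processing code:

```python
# Sketch: split a crossfeed raw CSV log back into per-fetch blocks
# by watching for a jump in the timestamp (assumed to be the
# 2nd-last field, per the comment above).  The max_gap threshold
# is an assumption chosen to sit well below the ~5.5 s fetch interval.

def split_into_fetches(lines, max_gap=2.0):
    """Group CSV lines into blocks; start a new block whenever the
    timestamp advances by more than max_gap seconds."""
    blocks, current, last_ts = [], [], None
    for line in lines:
        fields = line.rstrip("\n").split(",")
        ts = float(fields[-2])  # timestamp is the 2nd-last field
        if last_ts is not None and ts - last_ts > max_gap:
            blocks.append(current)
            current = []
        current.append(fields)
        last_ts = ts
    if current:
        blocks.append(current)
    return blocks
```

With three records at t=100.0, 100.1 and 105.5, the 5.4 s jump starts a second block, so the first two records land in one fetch and the third in the next.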
So the way it's working at the moment is that the CSV is imported into a staging database table, and from there processed into the various data tables: callsign, aircraft, flight (includes path as geom). Check here: https://pg.daffodil.uk.com/ So that way, we can actually check for dupes (probably a unique index of fid+callsign+aircraft+ts).
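The dupe guard proposed above, a unique index over fid+callsign+aircraft+ts on the staging table, can be sketched with an in-memory SQLite table. The column names here are assumptions; the real staging schema behind pg.daffodil.uk.com is Postgres:

```python
import sqlite3

# Sketch of the proposed dupe guard: a unique index over
# (fid, callsign, aircraft, ts).  Column names are assumptions;
# the real staging schema is Postgres, not SQLite.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE staging (
    fid TEXT, callsign TEXT, aircraft TEXT, ts REAL,
    lat REAL, lon REAL)""")
con.execute("""CREATE UNIQUE INDEX ux_staging
    ON staging (fid, callsign, aircraft, ts)""")

row = ("F123", "N12345", "c172p", 1470500000.0, 48.8, 2.3)
con.execute("INSERT OR IGNORE INTO staging VALUES (?,?,?,?,?,?)", row)
con.execute("INSERT OR IGNORE INTO staging VALUES (?,?,?,?,?,?)", row)  # dupe: silently skipped
count = con.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
```

`INSERT OR IGNORE` (or `ON CONFLICT DO NOTHING` in Postgres) means re-importing the same file is harmless: the second identical row is dropped and the count stays at 1.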
@pedromorgan your link requires authentication, so not sure what you are pointing to... I do a similar thing in cfcsvlogs.pl, to pre-process the CSV log, first into FIDs - that is, a unique set of flight records for that FID... Naturally, for any one FID there will also be ONE callsign and one MODEL, repeated in each record, with a set of lat,lon,... updates over time... To re-state: there are no duplicate records... I do not understand this idea of dupes.
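The per-FID pre-processing described above (one callsign and one model per FID, with a series of lat/lon updates over time) might look like this in outline. The column layout is an assumption for illustration, not what cfcsvlogs.pl actually does:

```python
from collections import defaultdict

# Sketch: collect all records for each flight ID.  For any one FID
# the callsign and model repeat on every record, so effectively only
# the (lat, lon, ts) positions accumulate.  Column order is assumed.
def records_by_fid(rows):
    flights = defaultdict(list)
    for fid, callsign, model, lat, lon, ts in rows:
        flights[fid].append((callsign, model, float(lat), float(lon), float(ts)))
    return flights
```

Every record carries the full FID/callsign/model triple, so within one FID the records differ only by position and timestamp, which is why no two records are ever exact duplicates in the raw log.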
From my end... I am banging the files into the database... even the same file.
@peteffs ok, well yes, if you are just banging files in... even the same file more than once... then yes, the SAME records will be inserted again. But you could just as easily keep track of CSV files done...
There is a problem where we have raw logs etc.
But also missing pieces, or an attempt to re-insert the same data etc.
So questions to @geoffmcl are:
Main thing here is the timestamp, and I assume that is from the last_updated json field from cf.ffs?