1746 Add SR CSV download for Address and Count #1752
Conversation
Hi @kdow, welcome and thanks for your hard work! I'm addressing the question regarding the table name querying. My explanation covers: Background, Dynamic Queries, where the year extraction implementations can be found, Consider Cross-Year Dates, and Extra Info.

I hope this explanation helps. Please let me know if you have any questions, I'd be happy to help!

That does help, thanks for the thorough explanation! I'll make that change in a revision today or tomorrow.
.github/ISSUE_TEMPLATE/blank_epic.md (Outdated)
I know you said to disregard, but I think this can be resolved if you pull latest from `main`, switch to your branch, and merge `main` into your branch. We aren't being picky about merge commits -- although let us know if you believe there is a concern with this approach.
I was able to merge another newer commit from main without it being added, but I wasn't able to push my changes without pulling from my branch first, and that pulled it back in. I can try fixing it again in another commit, or close this PR and open a new one.
Yeah, so sorry about all that. I was using GitHub to create a new template and didn't quite see the harm in pushing directly to main. I'll just stick to the usual process to avoid this in the future.
I think one last option would be for you to run `git rebase main` and resolve the merge conflicts by accepting the current changes (aka choosing your changes each time). Let me confirm this by running it locally to see if it prunes out that extra commit.
Ok I've attempted it on a branch that derives from your branch...

- On branch `main`, run `git pull`
- Switch to your branch, `1746-sr-csv-download-address-count`
- Run `git rebase main`
- There will be a first merge conflict
  - Address it by accepting all instances of your changes
  - Save the file
- Run `git add .` to keep those changes
- Run `git rebase --continue`
- There will be a second merge conflict
  - Repeat the steps above
- You might get a warning about "The previous cherry-pick is now empty..."
  - You can run `git rebase --skip` to conclude the rebase
- Run `git push`
- You'll see that you only have a few commits, since we've pruned my commit, plus a couple of your repeat commits/merges

Here's my end result on a demo PR: #1766
```js
    AND RequestType IN (${formattedRequestTypes})
    GROUP BY Address`;
}
return `SELECT * FROM requests_${startYear}
```
I would either convert this logic block to an explicit if-else, so readers can see that `startYear === endYear` and `grouped` are the deciding factors between the logical pathways, OR I would add comments indicating what the state of the logic should be, i.e. here on line 66: `// here, our data comes from the same year and we wish to obtain all rows that match the filters`
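To illustrate the suggestion, here is a hypothetical sketch of the explicit if-else structure. The function name, parameters, and query shapes are illustrative stand-ins, not the PR's actual code:

```javascript
// Hypothetical sketch only -- generateQuery, grouped, and the query shapes
// are illustrative stand-ins for the PR's actual identifiers.
function generateQuery({ startYear, endYear, grouped, formattedRequestTypes }) {
  if (startYear === endYear && grouped) {
    // Same-year data, aggregated per address for the Count CSV
    return `SELECT Address, COUNT(*) AS Count FROM requests_${startYear}
      WHERE RequestType IN (${formattedRequestTypes})
      GROUP BY Address`;
  }
  if (startYear === endYear) {
    // Same-year data: obtain all rows that match the filters
    return `SELECT * FROM requests_${startYear}
      WHERE RequestType IN (${formattedRequestTypes})`;
  }
  // Cross-year range: combine the per-year tables before filtering
  return `SELECT * FROM requests_${startYear}
    UNION ALL
    SELECT * FROM requests_${endYear}`;
}
```

Structured this way, each branch maps to one comment-documented pathway, so a reader never has to mentally re-derive which combination of `startYear === endYear` and `grouped` reaches which query.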
While we're considering comments, here's my other request for adding a note: #1746 (comment)
Hi @kdow, just wanna let you know that I didn't forget about reviewing this PR. I'm just waiting to see what you think about Ryan's requested changes. Thanks.
@kdow Thanks for your great work on parameterizing the query table name.

Pulled down the PR and tested locally. I uncommented the `ExportButton` in `FilterMenu.js` and downloaded the CSV. The columns that are supposed to have date values in date format, such as `CreatedDate`, `UpdatedDate`, `ServiceDate`, and `ClosedDate`, appeared as timestamps in milliseconds. See screenshot below:

In this example, the `CreatedDate` value for the first row is `1718226384000`, which is the timestamp (milliseconds since the Unix epoch) for 2024-06-12 21:06:24.

We need to convert the timestamps into a human-readable date format before exporting to CSV.
```js
const neighborhoodDataQuery = generateQuery();
const neighborhoodDataToExport = await conn.query(neighborhoodDataQuery);
const neighborhoodResults = ddbh.getTableData(neighborhoodDataToExport);
```
I would do the timestamp-to-date conversion here before passing `neighborhoodResults` to `Papa.unparse`.

Something like

```js
const formattedResults = neighborhoodResults.map(row => ({
  ...row,
  CreatedDate: row.CreatedDate ? moment(row.CreatedDate).format('YYYY-MM-DD HH:mm:ss') : null,
  UpdatedDate: row.UpdatedDate ? moment(row.UpdatedDate).format('YYYY-MM-DD HH:mm:ss') : null,
  ServiceDate: row.ServiceDate ? moment(row.ServiceDate).format('YYYY-MM-DD HH:mm:ss') : null,
}));
```
Need to include ClosedDate column in there
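One way to cover all four date columns, including `ClosedDate`, without repeating the ternary per column is to loop over a list of column names. This is a sketch, not the PR's code, and it uses the built-in `Date` API rather than moment so it stands alone:

```javascript
// Sketch only: covers all four date columns, including ClosedDate.
// The PR itself may use moment; this uses the built-in Date API instead.
const DATE_COLUMNS = ['CreatedDate', 'UpdatedDate', 'ServiceDate', 'ClosedDate'];

// 1718226384000 -> '2024-06-12 21:06:24' (UTC)
function formatTimestamp(ms) {
  return new Date(ms).toISOString().replace('T', ' ').slice(0, 19);
}

function formatDateColumns(rows) {
  return rows.map(row => {
    const formatted = { ...row };
    for (const col of DATE_COLUMNS) {
      // Leave empty/missing values as null so the CSV cell stays blank
      formatted[col] = row[col] ? formatTimestamp(row[col]) : null;
    }
    return formatted;
  });
}
```

Keeping the column list in one array also means a future date column only needs one new entry rather than another copied ternary.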
```js
const neighborhoodDataToExport = await conn.query(neighborhoodDataQuery);
const neighborhoodResults = ddbh.getTableData(neighborhoodDataToExport);

if (!isEmpty(neighborhoodResults)) {
```
then pass `formattedResults` here instead of `neighborhoodResults`, same for `Papa.unparse()` below.
force-pushed from `329c588` to `f88cdf2`
The PR is looking clean, and the final commit formatting the date output in the CSV looks good too. Approved.
Hi @kdow, thank you for cleaning up the timestamp transformation, the date columns are now in the correct date format. However, I encountered some other issues while testing different date ranges: the number of rows/SRs in the CSV is inconsistent with the number of dots/SRs visually showing on the UI map. See screenshots below. Take "Downtown LA" date range 2/1 to 2/15 for example, when I clicked the

This was tested with several different districts and different date ranges. Screenshots of inconsistent result examples:
- Sun Valley Area NC, 02/01/2024 to 02/15/2024
- Downtown Los Angeles, 05/01/2024 to 05/31/2024
- Downtown Los Angeles, 02/01/2024 to 02/15/2025
Given the complexity of the issue, I don't have enough time during this PR review to fully debug the problem and was not able to pinpoint where the issue is. Let me add a comment later today with suggestions on steps to investigate that may help you. Also looping in @ryanfchase @aqandrew @bphan002 to get fresh insights that could help us identify or pinpoint the problem.

P.S. Sorry if y'all were getting multiple email notifications from me editing this comment, I'm very bad at writing and usually require multiple edits.
Availability for PR Review: Friday 6/20/24. I looked at this briefly, but will need more time to look into it on Friday. @ryanfchase @Skydodle
I also got that alert message for some of the export requests even though there should be data. I think this is the issue, and here is my test case.

For NC: United NEIGHBORHOODS NC

I noticed the value in the formatted requestType is 'Single Streetlight', but the actual RequestType should be 'Single Streetlight Issue'. My guess is that there may be several requestTypes that are not mapping correctly to the actual names of the request. Since the names do not match exactly, the SQL query is filtering them out and causing the alert message to appear. Another example: 'Metal Appliances' should be 'Metal/Household Appliances' instead. I'm sure there are more that are not mapping to the correct names. One solution I can think of is to create a hashmap for the ones that are not mapping correctly.

Here are some side findings. Even though there should be 5 rows, I got 9. I noticed the same request has been created within a span of a couple of minutes, so the data is overlapping. I'm not sure if we want this filtered out in the future.
Map the request type to the exact string to see if that solves the issue.
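The hashmap idea could look something like the sketch below. The alias entries are only the two mismatches found so far in this review; a full audit of request type names would likely surface more, and these identifiers are illustrative, not the codebase's actual names:

```javascript
// Hypothetical alias map for UI labels whose names don't match the exact
// RequestType strings stored in the database. Only the two mismatches
// found during review are listed; a full audit would likely add more.
const REQUEST_TYPE_ALIASES = {
  'Single Streetlight': 'Single Streetlight Issue',
  'Metal Appliances': 'Metal/Household Appliances',
};

// Resolve a UI label to the exact database string before building the
// SQL filter, so near-miss names no longer drop rows from the export.
function toExactRequestType(label) {
  return REQUEST_TYPE_ALIASES[label] ?? label;
}
```

Labels without an alias pass through unchanged, so the map only needs entries for the known mismatches.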
Thanks all for helping to find and figure out some of these issues. I've updated the requestType names and createdBy end dates that were affecting the results. I also found another issue after fixing those: throughout the codebase we check for the SR status as 'Open' or 'Closed', but I've found there's also a status of 'Pending'. A lot of the results that were missing from the export but appeared in the browser had the 'Pending' status. I'll submit an updated revision tonight or tomorrow with the fixes.
This comment was marked as resolved.
Commits:
- Revise csv export to account for multiple years
- Convert timestamps to human-readable date in csv export

force-pushed from `f88cdf2` to `c6396b7`
Will complete review before Thurs 6/27 EOD.
The date range now includes the last date, and the CSV now populates if the range involves multiple years. Great job on these modifications.
Using the same test case:
- Boundaries: United NEIGHBORHOODS NC
- Date Range: 06/14/2024 - 06/21/2024
- Request Types: Single Streetlight

The pin for address 2336 S 4TH AVE, 90018 does not appear, but the address was included in the CSV. I'm not sure what is going on, but everything else looks great. I think another issue is needed in order to look into this. @ryanfchase @Skydodle

Approving on the condition that another issue is created for the pins not matching the CSV. (Unless the issue is related to the code and I couldn't pinpoint it.)

I think the pin for 2334 S 4TH AVE, 90018 might be overlapping the pin for 2336 S 4TH AVE, 90018, and that's why we can't see it on the map even though it's in the export. If that is the case, it would be ideal to have it viewable on the map in some way.
I checked on Google Maps: 2334 S 4TH AVE and 2336 S 4TH AVE are actually in the same building. The request for 2336 was created before 2334. So it seems like in our pin rendering system, if there are different requests on the same building, it will only render a pin for the latest request and the older ones get replaced. Or it could be overlapping as Kelly said; maybe it doesn't know how to display a second pin at a different spot on the same building.

2243 W 20TH ST also has two requests, and only the latest one is rendering a pin.

I think this bug should be addressed in another ticket with our pin rendering logic and not as part of this issue. Everything else looks good. Will go ahead and approve. Thank you so much for your hard work @kdow!
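To help scope that follow-up ticket, one quick way to confirm the overlap hypothesis would be to group the exported rows by rounded coordinates and report any location carrying more than one SR. This is a sketch only; the field names (`Latitude`, `Longitude`, `Address`) are assumptions about the export's shape:

```javascript
// Sketch only: field names (Latitude, Longitude, Address) are assumptions
// about the exported rows. Groups rows by rounded coordinates and returns
// the address lists for any location carrying more than one request,
// i.e. the pins that may be hiding each other on the map.
function findOverlappingPins(rows, precision = 5) {
  const groups = new Map();
  for (const row of rows) {
    const key = `${row.Latitude.toFixed(precision)},${row.Longitude.toFixed(precision)}`;
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(row.Address);
  }
  return [...groups.values()].filter(addresses => addresses.length > 1);
}
```

Running something like this over the exported rows would tell the new ticket exactly which addresses collide, without touching the rendering code.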
Fixes #1746

Notes:
- I had to change the table from `requests` to `requests_2024` for the queries to work. Should this be left as `requests` or `requests_2024` for now?
- This implementation will only generate the SR count CSV if it meets the following conditions: exactly one SR type is selected, an NC is selected, and status is Open. Should any of those requirements be changed?
- I've verified that the CSV is generated under the specified conditions, along with the NeighborhoodData.csv that was previously implemented on the `main` branch.
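The export conditions described in the notes could be expressed as a small guard, which would also make any future relaxation of the requirements a one-line change. The names below are illustrative assumptions, not the PR's actual identifiers:

```javascript
// Illustrative guard for the SR count CSV export conditions described
// above; parameter names are assumptions, not the PR's actual identifiers.
function canExportCountCsv({ requestTypes, council, status }) {
  return (
    requestTypes.length === 1 && // exactly one SR type selected
    Boolean(council) &&          // a Neighborhood Council selected
    status === 'Open'            // only Open requests
  );
}
```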