fix(server): importing items when it contains id #1284
Conversation
Walkthrough: The changes in this pull request primarily involve modifications to the handling of default values for specific field types.
✅ Deploy Preview for reearth-cms canceled.
Actionable comments posted: 6
🧹 Outside diff range and nitpick comments (4)
server/e2e/integration_asset_test.go (2)
Lines 13-24: Add documentation and consider improving test coverage. The function would benefit from:
- Documentation explaining its purpose and parameters
- Parameter validation
- Test cases with different `skipDecompression` values

Consider adding documentation and validation:

```diff
+// UploadAsset performs an asset upload request for testing purposes.
+// Parameters:
+//   e: HTTP test expectation context
+//   pId: Project ID
+//   path: File path
+//   content: File content
 func UploadAsset(e *httpexpect.Expect, pId string, path string, content string) *httpexpect.Value {
+	if e == nil {
+		panic("httpexpect.Expect is required")
+	}
+	if pId == "" {
+		panic("project ID is required")
+	}
 	res := e.POST("/api/projects/{projectId}/assets", pId).
```
Integration test coverage needed for asset upload endpoint
The codebase only contains unit tests for the `UploadAsset` functionality in `server/internal/infrastructure/fs/file_test.go`, but lacks integration/e2e tests for the API endpoint. This gap in test coverage should be addressed to ensure the complete upload flow works as expected. Key areas to test in the new integration test:
- Unauthorized access attempts
- Invalid file uploads
- Successful file upload with proper authorization
- Response structure validation
🔗 Analysis chain
Line range hint 27-99: Consider adding upload asset integration tests. While GET and DELETE operations are well tested, there are no integration tests using the newly relocated `UploadAsset` function. Let's check if upload tests exist elsewhere:
Would you like me to help create integration tests for the asset upload functionality?
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for tests using asset upload functionality
rg -l 'func.*Test.*Upload.*Asset' server/
```

Length of output: 90

Script:

```shell
#!/bin/bash
# Let's check the content of the found test file to understand if it's a unit test or integration test
rg -A 10 'func.*Test.*Upload.*Asset' server/internal/infrastructure/fs/file_test.go
```

Length of output: 395

Script:

```shell
#!/bin/bash
# Let's also check if there are any other test files that might test asset upload functionality using different naming
rg -l 'POST.*"/api/assets"' server/
```

Length of output: 37

Script:

```shell
#!/bin/bash
# Let's also check for any e2e tests related to asset upload
rg -l 'e\.POST.*"/api/assets"' server/
```

Length of output: 40
server/e2e/integration_item_import_test.go (2)
Lines 211-213: Remove unused commented-out variable. The variable `jsonContent` is defined but commented out and not used elsewhere in the code. Consider removing it to eliminate clutter.

Lines 258-310: Add error handling for API response validations. When creating items and verifying responses, it's good practice to check for errors explicitly. Adding assertions for unexpected statuses or error messages will make your tests more robust.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (5)
- server/e2e/gql_field_test.go (1 hunks)
- server/e2e/integration_asset_test.go (1 hunks)
- server/e2e/integration_item_import_test.go (1 hunks)
- server/e2e/integration_item_test.go (0 hunks)
- server/internal/adapter/integration/item_import.go (2 hunks)
💤 Files with no reviewable changes (1)
- server/e2e/integration_item_test.go
🔇 Additional comments (5)
server/e2e/integration_asset_test.go (1)
Line 5: LGTM: Import additions are appropriate. The new imports are necessary for the relocated `UploadAsset` function. Also applies to: line 8.
server/internal/adapter/integration/item_import.go (2)
Line 210: LGTM! Clear and correct field filtering. The condition correctly filters out nil values and the "id" field, which is handled separately. This change aligns well with the PR's objective of fixing item imports containing IDs.
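A minimal sketch of that filtering condition (names are illustrative; the actual loop lives in `item_import.go`):

```go
package main

import "fmt"

// copyFields copies user-supplied fields, skipping nil values and the
// reserved "id" key, which is handled separately. Illustrative sketch only.
func copyFields(o map[string]any) map[string]any {
	fields := map[string]any{}
	for k, v := range o {
		if v == nil || k == "id" {
			continue // drop nils and the separately-handled ID
		}
		fields[k] = v
	}
	return fields
}

func main() {
	in := map[string]any{"id": "01abc", "text": "hello", "bool": nil}
	fmt.Println(copyFields(in)) // only "text" survives
}
```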
Line range hint 169-210: Verify impact on existing integrations and GeoJSON imports. The changes to ID field handling might affect existing integrations. Please ensure:
- Integration tests cover both regular JSON and GeoJSON imports.
- Existing clients are aware of the change from "Id" to "id".
Let's verify the test coverage:
✅ Verification successful
Integration tests adequately cover both JSON and GeoJSON imports
The test coverage analysis shows comprehensive testing for both formats:
- `TestIntegrationModelImportJSONWithGeoJsonInput` covers GeoJSON imports with geometry handling
- `TestIntegrationModelImportJSONWithJsonInput1` and `TestIntegrationModelImportJSONWithJsonInput2` cover regular JSON imports
- Tests include various scenarios like insert/upsert strategies and schema mutations
The change from "Id" to "id" is safe as the tests use the lowercase "id" format in their test data, indicating this is the expected format. The GeoJSON handling is also properly tested with complete feature objects including geometry and properties.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Description: Check test coverage for item imports
# Look for test cases covering both JSON and GeoJSON imports

# Test: Search for JSON import test cases
echo "=== JSON Import Tests ==="
rg -A 5 "json.*import" --glob "*_test.go"

# Test: Search for GeoJSON test cases
echo "=== GeoJSON Import Tests ==="
rg -A 5 "geojson.*import" --glob "*_test.go"
```

Length of output: 218

Script:

```shell
#!/bin/bash
# Let's try a broader search for import-related tests and implementation

# Search for item import test files
echo "=== Item Import Test Files ==="
fd "item.*import.*test" --type f

# Search for import test cases with different patterns
echo -e "\n=== Import Test Cases ==="
rg -l "TestItem.*Import" --type go

# Look for actual test implementations
echo -e "\n=== Import Test Implementations ==="
rg -A 10 "func Test.*Import" --glob "*_test.go"

# Check the implementation file for supported formats
echo -e "\n=== Supported Import Formats ==="
rg "format.*:=|format.*=" server/internal/adapter/integration/item_import.go
```

Length of output: 5040
server/e2e/gql_field_test.go (2)
Line range hint 577-644: LGTM! Test coverage looks good. The test thoroughly validates the clearing of default values across multiple field types (date, integer, select) and ensures they are properly converted to `nil`.
Line 216: LGTM! Verify the corresponding implementation change. The change to use `nil` as the default value for the date field aligns with the handling of other field types and appears to be part of fixing item imports. Let's verify if there's a corresponding implementation change:
✅ Verification successful
The change to use `nil` as default value is correct and consistent with the implementation. The server-side implementation in `field_datetime.go` shows that date fields don't have any special default value handling, and the GraphQL schema confirms that date fields can have an optional `defaultValue` of type `Any`. The test's change to use `nil` as the default value aligns with this implementation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for date field default value handling in the implementation
rg -g '!*test*' -l 'defaultValue.*date'
```

Length of output: 138

Script:

```shell
#!/bin/bash
# Let's check the actual content of these files to understand the date field default value handling
echo "=== web/src/components/molecules/Common/MultiValueField/index.tsx ==="
rg -A 5 -B 5 'defaultValue.*date' 'web/src/components/molecules/Common/MultiValueField/index.tsx'

echo -e "\n=== web/src/gql/graphql-client-api.tsx ==="
rg -A 5 -B 5 'defaultValue.*date' 'web/src/gql/graphql-client-api.tsx'

# Let's also check the server-side implementation since the test is in server/e2e
echo -e "\n=== Server-side date field handling ==="
rg -g '!*test*' 'type.*DateField' server/
```

Length of output: 20699

Script:

```shell
#!/bin/bash
# Let's check the server-side implementation for date field handling, particularly around default values
echo "=== Server-side schema field date implementation ==="
rg -g '!*test*' -A 10 'struct.*DateField' server/

# Also check for date field type definitions and implementations
echo -e "\n=== Date field type definitions ==="
rg -g '!*test*' -A 5 'DateField.*struct' server/

# And check for any default value handling for date fields
echo -e "\n=== Date field default value handling ==="
rg -g '!*test*' -A 5 'date.*default' server/
```

Length of output: 450

Script:

```shell
#!/bin/bash
# Let's try a different approach to find the date field implementation
echo "=== Schema field type definitions ==="
rg -g '!*test*' 'type SchemaField' server/

# Check for any field type implementations
echo -e "\n=== Field type implementations ==="
rg -g '!*test*' 'type.*Field.*interface' server/

# Look for date-related field handling in schema package
echo -e "\n=== Date field handling in schema ==="
fd -t f . server/pkg/schema -x rg -l 'date|Date'
```

Length of output: 10391

Script:

```shell
#!/bin/bash
# Let's examine the date field implementation in the schema package
echo "=== Date field implementation ==="
cat server/pkg/schema/field_datetime.go

# Also check the GraphQL schema definition for date fields
echo -e "\n=== GraphQL schema for date fields ==="
rg -A 5 'type SchemaFieldDate' server/schemas/field.graphql
```

Length of output: 1120
```go
	"github.com/reearth/reearth-cms/server/internal/app"
	"github.com/reearth/reearth-cms/server/pkg/id"
)

func UploadAsset(e *httpexpect.Expect, pId string, path string, content string) *httpexpect.Value {
	res := e.POST("/api/projects/{projectId}/assets", pId).
		WithHeader("X-Reearth-Debug-User", uId1.String()).
```
🛠️ Refactor suggestion
Consider making the debug user configurable.
The debug user ID is hardcoded, which could make the function less flexible for different test scenarios.
Consider adding it as a parameter:
```diff
-func UploadAsset(e *httpexpect.Expect, pId string, path string, content string) *httpexpect.Value {
+func UploadAsset(e *httpexpect.Expect, pId string, path string, content string, debugUser string) *httpexpect.Value {
 	res := e.POST("/api/projects/{projectId}/assets", pId).
-		WithHeader("X-Reearth-Debug-User", uId1.String()).
+		WithHeader("X-Reearth-Debug-User", debugUser).
```
Committable suggestion was skipped due to low confidence.
```go
idStr, _ := o["id"].(string)
iId = id.ItemIDFromRef(&idStr)
```
Improve error handling for ID extraction.
The current implementation has potential issues:
- The two-value type assertion `o["id"].(string)` silently yields an empty string when the value exists but isn't a string, masking malformed input instead of reporting it.
- Errors from ID parsing are silently ignored.
Consider this safer implementation:
```diff
-idStr, _ := o["id"].(string)
-iId = id.ItemIDFromRef(&idStr)
+if idVal, ok := o["id"]; ok {
+	if idStr, ok := idVal.(string); ok {
+		iId = id.ItemIDFromRef(&idStr)
+	} else {
+		return nil, nil, rerror.NewE(i18n.T("invalid id format"))
+	}
+}
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
if idVal, ok := o["id"]; ok {
	if idStr, ok := idVal.(string); ok {
		iId = id.ItemIDFromRef(&idStr)
	} else {
		return nil, nil, rerror.NewE(i18n.T("invalid id format"))
	}
}
```
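The comma-ok pattern behind this suggestion can be shown as a self-contained sketch; `extractID`, its signature, and its error text are hypothetical stand-ins for the real extraction code:

```go
package main

import "fmt"

// extractID safely pulls an optional "id" out of a decoded JSON object using
// the comma-ok idiom. It distinguishes "field absent" (not an error) from
// "field present but not a string" (an error). Illustrative names only.
func extractID(o map[string]any) (string, bool, error) {
	idVal, ok := o["id"]
	if !ok {
		return "", false, nil // absent: no ID, no error
	}
	idStr, ok := idVal.(string)
	if !ok {
		return "", false, fmt.Errorf("invalid id format: got %T", idVal)
	}
	return idStr, true, nil
}

func main() {
	fmt.Println(extractID(map[string]any{"id": "01abc"})) // id present
	fmt.Println(extractID(map[string]any{}))              // id absent
	fmt.Println(extractID(map[string]any{"id": 42}))      // wrong type: error
}
```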
```go
res := e.PUT("/api/models/{modelId}/import", mId).
	WithHeader("Origin", "https://example.com").
	WithHeader("X-Reearth-Debug-User", uId1.String()).
	//WithHeader("Content-Type", "multipart/form-data").
```
🛠️ Refactor suggestion
Remove commented-out code for setting Content-Type header
The line `WithHeader("Content-Type", "multipart/form-data")` is commented out. Since the `WithMultipart()` method automatically sets the appropriate `Content-Type` header, you can remove this commented line to clean up the code.
```go
	// // strategy="insert" and mutateSchema=true
	// fileContent2 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// res2 := IntegrationModelImport(e, mId, "geoJson", "insert", true, fids.geometryObjectFid, fileContent2)

	// // strategy="update" and mutateSchema=false
	// fileContent3 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// res3 := IntegrationModelImport(e, mId, "geoJson", "update", false, fids.geometryObjectFid, fileContent3)

	// // strategy="update" and mutateSchema=true
	// fileContent4 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// res4 := IntegrationModelImport(e, mId, "geoJson", "update", true, fids.geometryObjectFid, fileContent4)

	// // strategy="upsert" and mutateSchema=false
	// fileContent5 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// res5 := IntegrationModelImport(e, mId, "geoJson", "upsert", false, fids.geometryObjectFid, fileContent5)

	// // strategy="upsert" and mutateSchema=true
	// fileContent6 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// res6 := IntegrationModelImport(e, mId, "geoJson", "upsert", true, fids.geometryObjectFid, fileContent6)
}
```
🛠️ Refactor suggestion
Clean up commented-out test cases
There are several test cases commented out in this section. If these tests are no longer needed, consider removing them to improve code readability. If they are intended for future use, you might want to track them separately or add a TODO comment explaining when they should be reintroduced.
```go
	// // strategy="insert" and mutateSchema=true
	// fileContent2 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [239.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// id2 := UploadAsset(e, pId, "./test2.geojson", fileContent2).Object().Value("id").String().Raw()
	// res2 := IntegrationModelImportJSON(e, mId, id2, "geoJson", "insert", true, fids.geometryObjectFid)

	// // strategy="update" and mutateSchema=false
	// fileContent3 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// id3 := UploadAsset(e, pId, "./test3.geojson", fileContent3).Object().Value("id").String().Raw()
	// res3 := IntegrationModelImportJSON(e, mId, id3, "geoJson", "update", false, fids.geometryObjectFid)

	// // strategy="update" and mutateSchema=true
	// fileContent4 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// id4 := UploadAsset(e, pId, "./test4.geojson", fileContent4).Object().Value("id").String().Raw()
	// res4 := IntegrationModelImportJSON(e, mId, id4, "geoJson", "update", true, fids.geometryObjectFid)

	// // strategy="upsert" and mutateSchema=false
	// fileContent5 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// id5 := UploadAsset(e, pId, "./test5.geojson", fileContent5).Object().Value("id").String().Raw()
	// res5 := IntegrationModelImportJSON(e, mId, id5, "geoJson", "upsert", false, fids.geometryObjectFid)

	// // strategy="upsert" and mutateSchema=true
	// fileContent6 := `{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [139.28179282584915,36.58570985749664]}, "properties": {"text": "test2"}}]}`
	// id6 := UploadAsset(e, pId, "./test6.geojson", fileContent6).Object().Value("id").String().Raw()
	// res6 := IntegrationModelImportJSON(e, mId, id6, "geoJson", "upsert", true, fids.geometryObjectFid)
}
```
🛠️ Refactor suggestion
Clean up commented-out test cases
Similar to the previous comment, this block contains commented-out tests. Removing unused code can make the test suite clearer and more maintainable.
```go
jsonContent := `[{"id": "` + iId + `","text": "test1", "bool": true, "number": 1.1},{"text": "test2", "bool": false, "number": 2},{"text": "test3", "bool": null, "number": null}]`
aId := UploadAsset(e, pId, "./test1.json", jsonContent).Object().Value("id").String().Raw()
```
Safely construct JSON content to prevent potential injection issues
The `jsonContent` string is constructed using string concatenation with `iId`. If `iId` contains unexpected characters, it could lead to JSON formatting issues or injection vulnerabilities. It's safer to use JSON marshalling to construct the content.
Apply this diff to construct the JSON content safely:
```diff
-jsonContent := `[{"id": "` + iId + `","text": "test1", "bool": true, "number": 1.1},{"text": "test2", "bool": false, "number": 2},{"text": "test3", "bool": null, "number": null}]`
+items := []map[string]interface{}{
+	{"id": iId, "text": "test1", "bool": true, "number": 1.1},
+	{"text": "test2", "bool": false, "number": 2},
+	{"text": "test3", "bool": nil, "number": nil},
+}
+jsonBytes, err := json.Marshal(items)
+if err != nil {
+	t.Fatalf("Failed to marshal JSON: %v", err)
+}
+jsonContent := string(jsonBytes)
```
Ensure you have imported the `encoding/json` package:
```diff
 import (
 	"fmt"
 	"net/http"
 	"strings"
 	"testing"
+	"encoding/json"

 	"github.com/gavv/httpexpect/v2"
 	"github.com/reearth/reearth-cms/server/internal/app"
 )
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
items := []map[string]interface{}{
	{"id": iId, "text": "test1", "bool": true, "number": 1.1},
	{"text": "test2", "bool": false, "number": 2},
	{"text": "test3", "bool": nil, "number": nil},
}
jsonBytes, err := json.Marshal(items)
if err != nil {
	t.Fatalf("Failed to marshal JSON: %v", err)
}
jsonContent := string(jsonBytes)
aId := UploadAsset(e, pId, "./test1.json", jsonContent).Object().Value("id").String().Raw()
```
Summary by CodeRabbit
- New Features
- Bug Fixes
- Refactor
- Chores