
doc: pyspark write example #55

Merged

8 commits merged into master from wey-gu-pyspark-write-example on Aug 24, 2022
Conversation

wey-gu (Contributor) commented Aug 23, 2022

as titled

codecov-commenter commented Aug 23, 2022

Codecov Report

Merging #55 (3706c47) into master (6b80a2f) will not change coverage.
The diff coverage is n/a.

```
@@            Coverage Diff            @@
##             master      #55   +/-   ##
=========================================
  Coverage     74.09%   74.09%
  Complexity      153      153
=========================================
  Files            24       24
  Lines          1594     1594
  Branches        256      256
=========================================
  Hits           1181     1181
  Misses          266      266
  Partials        147      147
```


README.md (outdated review thread):

```python
"graphAddress", "graphd1:9669").option(
    "passwd", "nebula").option(
    "writeMode", "delete").option(
    "user", "root").save()
```
Contributor commented:

well done! great example!
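A fuller version of the quoted delete call might look like the sketch below. Only graphAddress, passwd, writeMode, and user come from the hunk above; the data source class name and the remaining options (type, spaceName, label, vertexField, batch, metaAddress) are assumptions for illustration.

```python
# Hedged sketch of a pyspark vertex-delete write with the Nebula Spark Connector.
# graphAddress, passwd, writeMode and user appear in the quoted hunk; the data
# source class and the other option names are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("nebula-delete-sketch").getOrCreate()
# assumed: a DataFrame holding the vertex IDs to delete
df = spark.createDataFrame([("player100",), ("player101",)], ["_c0"])

df.write.format("com.vesoft.nebula.connector.NebulaDataSourceV2").option(
    "type", "vertex").option(
    "spaceName", "basketballplayer").option(
    "label", "player").option(
    "vertexField", "_c0").option(
    "batch", "1").option(
    "metaAddress", "metad0:9559").option(
    "graphAddress", "graphd1:9669").option(
    "passwd", "nebula").option(
    "writeMode", "delete").option(
    "user", "root").save()
```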

```scala
val RANK_AS_PROP: String = "rankAsProp"
val WRITE_MODE: String = "writeMode"
val DELETE_EDGE: String = "deleteEdge"
```
Contributor commented:

Should we post the option's datatype here?

wey-gu (Contributor, Author) replied:

Do you mean we should add things like bool for deleteEdge? And should we expose those options here or not?

wey-gu (Contributor, Author) commented Aug 24, 2022:

We could add inline comments on the supported value/type, like this (using // as the Scala comment marker):

```scala
  val RANK_AS_PROP: String = "rankAsProp" // bool, e.g. true
  val WRITE_MODE: String   = "writeMode"  // string, e.g. 'update', 'insert', 'delete'
  val DELETE_EDGE: String  = "deleteEdge" // bool, e.g. true
```

Contributor replied:

> We could add inline comments on the supported value/type...

Yeah, great approach. That gives enough information about the options now.
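To make the agreed value types concrete, a hedged pyspark sketch follows: writeMode and deleteEdge are the keys discussed in this thread, while the data source class and the remaining option names are assumptions reused from the earlier sketch.

```python
# Hedged sketch: deleting vertices together with their attached edges from pyspark.
# writeMode (string: 'insert', 'update' or 'delete') and deleteEdge (bool) are the
# keys discussed above; every other name below is an assumption for illustration.
# df holds the vertex IDs to delete, as in the earlier sketch.
(df.write
   .format("com.vesoft.nebula.connector.NebulaDataSourceV2")  # assumed class name
   .option("type", "vertex")                                  # assumed
   .option("spaceName", "basketballplayer")                   # assumed
   .option("label", "player")                                 # assumed
   .option("vertexField", "_c0")                              # assumed
   .option("metaAddress", "metad0:9559")                      # assumed
   .option("graphAddress", "graphd1:9669")
   .option("user", "root")
   .option("passwd", "nebula")
   .option("writeMode", "delete")   # string option
   .option("deleteEdge", "true")    # bool option, passed as "true"/"false"
   .save())
```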

Nicole00 (Contributor) left a comment:

LGTM

Nicole00 merged commit dd535bf into master on Aug 24, 2022
wey-gu deleted the wey-gu-pyspark-write-example branch on Aug 24, 2022 03:07
Nicole00 added a commit that referenced this pull request Aug 31, 2022
* pyspark example added (#51)

* pyspark example added

* Update README_CN.md

* support delete related edges when delete vertex (#53)

* support delete related edges when delete vertex

* add test

* add example for delete vertex with edge (#54)

* doc: pyspark write example (#55)

* doc: pyspark write example

* Added pyshell calling lines and python file header

discussed in #50
Thanks to @Reid00

* Update README.md

wording

* Update README_CN.md

* Update README.md

* Update README_CN.md

* Update README.md

* Update README_CN.md

* spark2.2 reader initial commit

* spark2.2 reader initial commit

* extract common config for multi spark version

* delete common config files

* extract common config and utils

* remove common test

* spark connector reader for spark 2.2

* spark connector writer for spark 2.2

* revert example

* refactor spark version & close metaProvider after finish writing

* refactor common package name

* fix scan part

* refactor spark version for spark2.2

* connector writer for spark2.2

Co-authored-by: Wey Gu <[email protected]>