Refactor initialize method #675
Merged: stu1130 merged 1 commit into deepjavalibrary:block_improvement from stu1130:make_xavier on Feb 24, 2021
Conversation
zachgk approved these changes on Feb 24, 2021
stu1130 added a commit that referenced this pull request on Feb 25, 2021
stu1130 added a commit to stu1130/djl that referenced this pull request on Feb 26, 2021
stu1130 added a commit that referenced this pull request on Feb 26, 2021
stu1130 added a commit to stu1130/djl that referenced this pull request on Mar 1, 2021
stu1130 added a commit that referenced this pull request on Mar 2, 2021
zachgk added a commit that referenced this pull request on Mar 11, 2021
* This creates the component which will populate the Download Tab with Download Buttons.
* Making a place for the download buttons.
* Adding the Model Download Handler, allowing the backend to feed the links into the Model View, and making slight changes for readability.
* Getting rid of some of the test code.
* Improve Block usability (#712)
* Use builder pattern for Parameter (#661)
* Make XavierInitializer default value & Improve setInitializer (#664)
* Refactor initialize (#675)
* Remove NDManager on getOutputShapes (#710)
* Removing unnecessary logging messages.
* block factory init commit (#697)
* [DOCS] Fixing TrainingListener documentation (#718)
  * Fixing TrainingListener documentation
  * Fixing PR reviews
* Fix DJL serving flaky test for mac (#721)
  Change-Id: I9eccc84b0c34652e50c5fe5a4fe42f2b82d65a3d
* Fixing all of the nits.
* Getting rid of unnecessary methods.
* update onnxruntime along with String tensor (#724)
* Add profiler doc (#722)
* Resolving some comments.
* Using better criteria in case multiple models have the same name.
* Fixing the Javadoc.
* Configure verbose of mxnet extra libraries (#728)
  Change-Id: I66d54aa496cccbb9e8c0a89eeaa458605958d9c6
* Added a TODO for using the artifact repo to get the base URI.
* paddlepaddle CN notebook (#730)
  * install font (Change-Id: I2d749e617b0bf78ecbcd168b82c53a1fab49a2c0)
  * refactor on name (Change-Id: I9e379eee51ceae16391850b3ba9782acb04c4021)
  * Refine the text
  Co-authored-by: gstu1130 <[email protected]>
* add EI documentation (#733)
  * fix PMD rules (Change-Id: Ieee5577c26f6df2843781f8f9180de35069a5de3)
* allow pytorch stream model loading (#729)
  * updates (Change-Id: Ibc26261b90de673712e90de0d640a8f32f23763e)
* add NDList decode from inputStream (#734)
  Change-Id: I6a31d8b0b955f2dbb762220b101e3928a34699c1
* Remove memory scope and improve memory management (#695)

  The MemoryScope revealed a number of shortcomings in DJL memory management; while the MemoryScope is deleted, many of them are fixed as part of this PR. First, NDManager.{attach, detach} were renamed to xxxInternal, to differentiate them from the attach and detach methods that are intended for general use.

  There are two new concepts in memory management. An NDResource interface was created to unify the managed-memory concepts used in NDArray and NDList (it could be applied to more classes in the future); it includes getManager, attach, and detach. The NDManager also gains a second "management convention". Under the first convention, normal resources are added to the manager and then closed when the manager closes. This works for small numbers of NDArrays, but not when operations transitively create resources. The second convention is the tempResource: instead of being freed when the manager closes, these resources are returned to their original manager. This makes it possible to create a temporary scope, perform operations within it, and then return the inputs and return value to the parent while the intermediate work is cleaned up. This also matches the concept of ownership/borrowing.

  Using these, a few additional helper methods were created. There is `NDManager.from(resource)` to ease creation of managers based on a resource, `scopeManager.ret(returnValue)` to help return values outside of the scopeManager, and `scopeManager.{temp,}AttachAll` to attach a number of resources to a manager in a single call. With these improvements, the new methods were applied to the old locations where MemoryScope was used, as well as an additional case in NDManagerEx. The old attach methods were also changed to return `void`: the return values are no longer used anywhere and are not necessary in the current scheme, which simplifies cases like `NDList.attach` that do not have a single original NDManager when attaching.

  Change-Id: I91d109cd14d70fa64fd8fffa0b50d88ab053013e
* Remove erroneous random forest application (#726)

  The application was changed to the more accurate softmax_regression (matching the terminology from the D2L book).
  Change-Id: I1f69f005bbe38b125f2709c2988d06c14eebb765
* Minor fixes on duplicated code (#736)
  * remove methods that are already defined in NDArrayAdapter (Change-Id: I01cc03a7f5b427bf31c6b3fd8d2136f2a27fe93b)
  * refactor toString (Change-Id: Iea22b16e1daa9f759b55c1a8b8b85536482e551a)
  * remove sparse NDArray (Change-Id: Icb44096519775f54cb32cc768c14f49e33dc7ea5)
  * fix test (Change-Id: Icef580ed77e7bba22864ce44577de3cba51e3e41)

Co-authored-by: Jake Lee <[email protected]>
Co-authored-by: Lanking <[email protected]>
Co-authored-by: aksrajvanshi <[email protected]>
Co-authored-by: Frank Liu <[email protected]>
Co-authored-by: Zach Kimberg <[email protected]>
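The scope/ownership convention described in the #695 commit message can be sketched in plain Java, independent of DJL. The `Resource` and `Scope` names below are illustrative, not DJL's actual classes: resources created inside a scope are freed when the scope closes, unless explicitly returned (`ret`) to the parent, mirroring the tempResource/`scopeManager.ret(...)` idea.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch (not DJL's API) of the "temp resource" convention:
// intermediates die with the scope; returned values move to the parent.
class Resource {
    final String name;
    boolean closed;
    Resource(String name) { this.name = name; }
    void close() { closed = true; }
}

class Scope implements AutoCloseable {
    final Scope parent;
    private final List<Resource> owned = new ArrayList<>();

    Scope(Scope parent) { this.parent = parent; }

    Resource create(String name) {      // new resources belong to this scope
        Resource r = new Resource(name);
        owned.add(r);
        return r;
    }

    Resource ret(Resource r) {          // hand a result back to the parent scope
        owned.remove(r);
        if (parent != null) {
            parent.owned.add(r);
        }
        return r;
    }

    @Override
    public void close() {               // intermediates are freed here
        for (Resource r : owned) {
            r.close();
        }
        owned.clear();
    }
}

public class ScopeDemo {
    public static void main(String[] args) {
        Scope root = new Scope(null);
        Resource result;
        try (Scope tmp = new Scope(root)) {
            tmp.create("intermediate");          // cleaned when tmp closes
            result = tmp.ret(tmp.create("result"));
        }
        System.out.println(result.closed);       // false: owned by root now
        root.close();
        System.out.println(result.closed);       // true
    }
}
```

The try-with-resources block plays the role of the temporary NDManager: the intermediate is closed at the end of the block, while the returned value survives until the parent closes.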
Lokiiiiii pushed a commit to Lokiiiiii/djl that referenced this pull request on Oct 10, 2023
Refactor the way the block is built. Reference: TensorFlow.
To split the work into smaller PRs, this change still leverages getOutputShapes; I will address that in the next PR.
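The direction of this PR series (builder-pattern Parameter in #661, a default XavierInitializer and improved setInitializer in #664, refactored initialize here) can be sketched in plain Java. All names below (`Parameter.Builder`, `Block`, `Initializer`) are simplified stand-ins, not DJL's actual signatures: parameters are declared up front via a builder, an initializer can be swapped in before initialization, and arrays are only materialized when `initialize` runs with known shapes.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch (not DJL's real API) of builder-built parameters
// with lazily applied, overridable initializers.
interface Initializer {
    float[] initialize(int size);
}

class Parameter {
    final String name;
    Initializer initializer;
    float[] array;                      // filled in when initialize() runs

    private Parameter(Builder b) {
        this.name = b.name;
        this.initializer = b.initializer;
    }

    static Builder builder() { return new Builder(); }

    static class Builder {
        private String name;
        private Initializer initializer = size -> new float[size]; // default: zeros

        Builder setName(String name) { this.name = name; return this; }
        Builder optInitializer(Initializer init) { this.initializer = init; return this; }
        Parameter build() { return new Parameter(this); }
    }
}

class Block {
    private final Map<String, Parameter> params = new LinkedHashMap<>();

    void addParameter(Parameter p) { params.put(p.name, p); }

    Parameter getParameter(String name) { return params.get(name); }

    void setInitializer(Initializer init) {  // override before initialize()
        for (Parameter p : params.values()) { p.initializer = init; }
    }

    void initialize(int size) {              // shapes known: materialize arrays
        for (Parameter p : params.values()) { p.array = p.initializer.initialize(size); }
    }
}

public class BuilderDemo {
    public static void main(String[] args) {
        Block block = new Block();
        block.addParameter(Parameter.builder().setName("weight").build());
        block.setInitializer(size -> {       // e.g. a stand-in for Xavier init
            float[] a = new float[size];
            java.util.Arrays.fill(a, 0.5f);
            return a;
        });
        block.initialize(4);
        System.out.println(block.getParameter("weight").array.length); // 4
    }
}
```

Keeping initialization out of the constructor is what lets the shapes flow in later (here the `size` argument stands in for the output-shape computation the PR still delegates to getOutputShapes).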